datasetId | card |
|---|---|
NekoNya99/pokemondb | ---
license: unknown
---
|
AdapterOcean/code_instructions_standardized_cluster_3_alpaca | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 33990857
num_examples: 10568
download_size: 18250822
dataset_size: 33990857
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "code_instructions_standardized_cluster_3_alpaca"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
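Every cell in the `card` column of this dump opens with a YAML front-matter block fenced by `---` lines (`license`, `dataset_info`, `configs`, ...). As a minimal stdlib-only sketch (not a full YAML parser; the function name is illustrative), the flat top-level fields of such a block could be pulled out like this:

```python
def parse_front_matter(card: str) -> dict:
    """Extract simple top-level `key: value` pairs from a dataset card's
    YAML front matter (the block between the opening pair of '---' lines).
    Nested mappings and list items are deliberately skipped; this is a
    rough sketch, not a full YAML parser."""
    lines = card.splitlines()
    if not lines or lines[0].strip() != "---":
        return {}  # no front-matter block at the top of the card
    fields = {}
    for line in lines[1:]:
        if line.strip() == "---":
            break  # closing delimiter: front matter ends here
        # keep only un-indented scalar lines, e.g. "license: unknown"
        if line and not line[0].isspace() and not line.startswith("-") and ": " in line:
            key, _, value = line.partition(": ")
            if value.strip():
                fields[key.strip()] = value.strip()
    return fields

# e.g. the first card above:
print(parse_front_matter("---\nlicense: unknown\n---\n"))  # → {'license': 'unknown'}
```

Keys whose value sits on the following lines (such as `dataset_info:` or `configs:`) are skipped, which keeps the sketch safe on the more deeply nested cards in this dump.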
zyerrr/images | ---
license: openrail
---
|
open-llm-leaderboard/details_Inv__Konstanta-V3-BetaFlavour-7B | ---
pretty_name: Evaluation run of Inv/Konstanta-V3-BetaFlavour-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Inv/Konstanta-V3-BetaFlavour-7B](https://huggingface.co/Inv/Konstanta-V3-BetaFlavour-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Inv__Konstanta-V3-BetaFlavour-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-10T00:52:07.659928](https://huggingface.co/datasets/open-llm-leaderboard/details_Inv__Konstanta-V3-BetaFlavour-7B/blob/main/results_2024-03-10T00-52-07.659928.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6181847903415443,\n\
\ \"acc_stderr\": 0.03300581002480976,\n \"acc_norm\": 0.6193449608180348,\n\
\ \"acc_norm_stderr\": 0.03368163016889217,\n \"mc1\": 0.5862913096695227,\n\
\ \"mc1_stderr\": 0.0172408618120998,\n \"mc2\": 0.729182943288448,\n\
\ \"mc2_stderr\": 0.01463892425987301\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.658703071672355,\n \"acc_stderr\": 0.01385583128749772,\n\
\ \"acc_norm\": 0.681740614334471,\n \"acc_norm_stderr\": 0.013611993916971453\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6781517625970922,\n\
\ \"acc_stderr\": 0.004662303395239621,\n \"acc_norm\": 0.8687512447719578,\n\
\ \"acc_norm_stderr\": 0.003369821004762251\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n\
\ \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n\
\ \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.0373852067611967,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.0373852067611967\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n\
\ \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n \
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6754716981132075,\n \"acc_stderr\": 0.02881561571343211,\n\
\ \"acc_norm\": 0.6754716981132075,\n \"acc_norm_stderr\": 0.02881561571343211\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7083333333333334,\n\
\ \"acc_stderr\": 0.038009680605548594,\n \"acc_norm\": 0.7083333333333334,\n\
\ \"acc_norm_stderr\": 0.038009680605548594\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\"\
: 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6184971098265896,\n\
\ \"acc_stderr\": 0.037038511930995215,\n \"acc_norm\": 0.6184971098265896,\n\
\ \"acc_norm_stderr\": 0.037038511930995215\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.048580835742663454,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.048580835742663454\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5234042553191489,\n \"acc_stderr\": 0.03265019475033582,\n\
\ \"acc_norm\": 0.5234042553191489,\n \"acc_norm_stderr\": 0.03265019475033582\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4298245614035088,\n\
\ \"acc_stderr\": 0.04657047260594963,\n \"acc_norm\": 0.4298245614035088,\n\
\ \"acc_norm_stderr\": 0.04657047260594963\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482758,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482758\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4126984126984127,\n \"acc_stderr\": 0.02535574126305527,\n \"\
acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.02535574126305527\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.044444444444444495,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.044444444444444495\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5741935483870968,\n\
\ \"acc_stderr\": 0.028129112709165904,\n \"acc_norm\": 0.5741935483870968,\n\
\ \"acc_norm_stderr\": 0.028129112709165904\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.03517603540361008,\n\
\ \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.03517603540361008\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\"\
: 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7828282828282829,\n \"acc_stderr\": 0.029376616484945627,\n \"\
acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.029376616484945627\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8497409326424871,\n \"acc_stderr\": 0.025787723180723886,\n\
\ \"acc_norm\": 0.8497409326424871,\n \"acc_norm_stderr\": 0.025787723180723886\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6128205128205129,\n \"acc_stderr\": 0.02469721693087894,\n \
\ \"acc_norm\": 0.6128205128205129,\n \"acc_norm_stderr\": 0.02469721693087894\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2851851851851852,\n \"acc_stderr\": 0.027528599210340492,\n \
\ \"acc_norm\": 0.2851851851851852,\n \"acc_norm_stderr\": 0.027528599210340492\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6596638655462185,\n \"acc_stderr\": 0.030778057422931673,\n\
\ \"acc_norm\": 0.6596638655462185,\n \"acc_norm_stderr\": 0.030778057422931673\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8256880733944955,\n \"acc_stderr\": 0.016265675632010347,\n \"\
acc_norm\": 0.8256880733944955,\n \"acc_norm_stderr\": 0.016265675632010347\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.44907407407407407,\n \"acc_stderr\": 0.03392238405321616,\n \"\
acc_norm\": 0.44907407407407407,\n \"acc_norm_stderr\": 0.03392238405321616\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7549019607843137,\n \"acc_stderr\": 0.030190282453501947,\n \"\
acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.030190282453501947\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7848101265822784,\n \"acc_stderr\": 0.02675082699467618,\n \
\ \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.02675082699467618\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6636771300448431,\n\
\ \"acc_stderr\": 0.031708824268455,\n \"acc_norm\": 0.6636771300448431,\n\
\ \"acc_norm_stderr\": 0.031708824268455\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7251908396946565,\n \"acc_stderr\": 0.03915345408847836,\n\
\ \"acc_norm\": 0.7251908396946565,\n \"acc_norm_stderr\": 0.03915345408847836\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.043300437496507416,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.043300437496507416\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.03462419931615623,\n\
\ \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.03462419931615623\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5178571428571429,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.5178571428571429,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n\
\ \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8007662835249042,\n\
\ \"acc_stderr\": 0.014283378044296418,\n \"acc_norm\": 0.8007662835249042,\n\
\ \"acc_norm_stderr\": 0.014283378044296418\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7109826589595376,\n \"acc_stderr\": 0.02440517393578323,\n\
\ \"acc_norm\": 0.7109826589595376,\n \"acc_norm_stderr\": 0.02440517393578323\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.41787709497206704,\n\
\ \"acc_stderr\": 0.016495400635820084,\n \"acc_norm\": 0.41787709497206704,\n\
\ \"acc_norm_stderr\": 0.016495400635820084\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.026090162504279053,\n\
\ \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.026090162504279053\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6752411575562701,\n\
\ \"acc_stderr\": 0.026596782287697043,\n \"acc_norm\": 0.6752411575562701,\n\
\ \"acc_norm_stderr\": 0.026596782287697043\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6882716049382716,\n \"acc_stderr\": 0.02577311116963045,\n\
\ \"acc_norm\": 0.6882716049382716,\n \"acc_norm_stderr\": 0.02577311116963045\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.46808510638297873,\n \"acc_stderr\": 0.029766675075873866,\n \
\ \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.029766675075873866\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44198174706649285,\n\
\ \"acc_stderr\": 0.012683972513598816,\n \"acc_norm\": 0.44198174706649285,\n\
\ \"acc_norm_stderr\": 0.012683972513598816\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6433823529411765,\n \"acc_stderr\": 0.029097209568411952,\n\
\ \"acc_norm\": 0.6433823529411765,\n \"acc_norm_stderr\": 0.029097209568411952\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6503267973856209,\n \"acc_stderr\": 0.019291961895066382,\n \
\ \"acc_norm\": 0.6503267973856209,\n \"acc_norm_stderr\": 0.019291961895066382\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7061224489795919,\n \"acc_stderr\": 0.02916273841024977,\n\
\ \"acc_norm\": 0.7061224489795919,\n \"acc_norm_stderr\": 0.02916273841024977\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.582089552238806,\n\
\ \"acc_stderr\": 0.034875586404620636,\n \"acc_norm\": 0.582089552238806,\n\
\ \"acc_norm_stderr\": 0.034875586404620636\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036846,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5862913096695227,\n\
\ \"mc1_stderr\": 0.0172408618120998,\n \"mc2\": 0.729182943288448,\n\
\ \"mc2_stderr\": 0.01463892425987301\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8129439621152328,\n \"acc_stderr\": 0.010959716435242912\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5686125852918877,\n \
\ \"acc_stderr\": 0.013642195352511563\n }\n}\n```"
repo_url: https://huggingface.co/Inv/Konstanta-V3-BetaFlavour-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|arc:challenge|25_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|gsm8k|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hellaswag|10_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|winogrande|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-10T00-52-07.659928.parquet'
- config_name: results
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- results_2024-03-10T00-52-07.659928.parquet
- split: latest
path:
- results_2024-03-10T00-52-07.659928.parquet
---
# Dataset Card for Evaluation run of Inv/Konstanta-V3-BetaFlavour-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Inv/Konstanta-V3-BetaFlavour-7B](https://huggingface.co/Inv/Konstanta-V3-BetaFlavour-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Inv__Konstanta-V3-BetaFlavour-7B",
"harness_winogrande_5",
split="train")
```
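Judging from the config list above, each timestamped split name appears to be the run timestamp with `-` and `:` replaced by `_` (this is an observation about this card's naming convention, not a documented API guarantee). A minimal sketch of that mapping:

```python
# Map a run timestamp to the corresponding split name used in this card.
# Note: this mirrors the naming observed in the configs above; it is not
# an official leaderboard API.
def timestamp_to_split(ts: str) -> str:
    return ts.replace("-", "_").replace(":", "_")

print(timestamp_to_split("2024-03-10T00:52:07.659928"))
# → 2024_03_10T00_52_07.659928
```

This can be handy for selecting a specific run's split instead of `latest`.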
## Latest results
These are the [latest results from run 2024-03-10T00:52:07.659928](https://huggingface.co/datasets/open-llm-leaderboard/details_Inv__Konstanta-V3-BetaFlavour-7B/blob/main/results_2024-03-10T00-52-07.659928.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6181847903415443,
"acc_stderr": 0.03300581002480976,
"acc_norm": 0.6193449608180348,
"acc_norm_stderr": 0.03368163016889217,
"mc1": 0.5862913096695227,
"mc1_stderr": 0.0172408618120998,
"mc2": 0.729182943288448,
"mc2_stderr": 0.01463892425987301
},
"harness|arc:challenge|25": {
"acc": 0.658703071672355,
"acc_stderr": 0.01385583128749772,
"acc_norm": 0.681740614334471,
"acc_norm_stderr": 0.013611993916971453
},
"harness|hellaswag|10": {
"acc": 0.6781517625970922,
"acc_stderr": 0.004662303395239621,
"acc_norm": 0.8687512447719578,
"acc_norm_stderr": 0.003369821004762251
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.0373852067611967,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.0373852067611967
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6754716981132075,
"acc_stderr": 0.02881561571343211,
"acc_norm": 0.6754716981132075,
"acc_norm_stderr": 0.02881561571343211
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7083333333333334,
"acc_stderr": 0.038009680605548594,
"acc_norm": 0.7083333333333334,
"acc_norm_stderr": 0.038009680605548594
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.037038511930995215,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.037038511930995215
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.048580835742663454,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.048580835742663454
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.74,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5234042553191489,
"acc_stderr": 0.03265019475033582,
"acc_norm": 0.5234042553191489,
"acc_norm_stderr": 0.03265019475033582
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4298245614035088,
"acc_stderr": 0.04657047260594963,
"acc_norm": 0.4298245614035088,
"acc_norm_stderr": 0.04657047260594963
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482758,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482758
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.02535574126305527,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.02535574126305527
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.044444444444444495,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.044444444444444495
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5741935483870968,
"acc_stderr": 0.028129112709165904,
"acc_norm": 0.5741935483870968,
"acc_norm_stderr": 0.028129112709165904
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.03517603540361008,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.03517603540361008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.029376616484945627,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.029376616484945627
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8497409326424871,
"acc_stderr": 0.025787723180723886,
"acc_norm": 0.8497409326424871,
"acc_norm_stderr": 0.025787723180723886
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6128205128205129,
"acc_stderr": 0.02469721693087894,
"acc_norm": 0.6128205128205129,
"acc_norm_stderr": 0.02469721693087894
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2851851851851852,
"acc_stderr": 0.027528599210340492,
"acc_norm": 0.2851851851851852,
"acc_norm_stderr": 0.027528599210340492
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6596638655462185,
"acc_stderr": 0.030778057422931673,
"acc_norm": 0.6596638655462185,
"acc_norm_stderr": 0.030778057422931673
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8256880733944955,
"acc_stderr": 0.016265675632010347,
"acc_norm": 0.8256880733944955,
"acc_norm_stderr": 0.016265675632010347
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.44907407407407407,
"acc_stderr": 0.03392238405321616,
"acc_norm": 0.44907407407407407,
"acc_norm_stderr": 0.03392238405321616
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.030190282453501947,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.030190282453501947
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7848101265822784,
"acc_stderr": 0.02675082699467618,
"acc_norm": 0.7848101265822784,
"acc_norm_stderr": 0.02675082699467618
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6636771300448431,
"acc_stderr": 0.031708824268455,
"acc_norm": 0.6636771300448431,
"acc_norm_stderr": 0.031708824268455
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7251908396946565,
"acc_stderr": 0.03915345408847836,
"acc_norm": 0.7251908396946565,
"acc_norm_stderr": 0.03915345408847836
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228732,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228732
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.043300437496507416,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.043300437496507416
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7361963190184049,
"acc_stderr": 0.03462419931615623,
"acc_norm": 0.7361963190184049,
"acc_norm_stderr": 0.03462419931615623
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5178571428571429,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.5178571428571429,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8007662835249042,
"acc_stderr": 0.014283378044296418,
"acc_norm": 0.8007662835249042,
"acc_norm_stderr": 0.014283378044296418
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7109826589595376,
"acc_stderr": 0.02440517393578323,
"acc_norm": 0.7109826589595376,
"acc_norm_stderr": 0.02440517393578323
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.41787709497206704,
"acc_stderr": 0.016495400635820084,
"acc_norm": 0.41787709497206704,
"acc_norm_stderr": 0.016495400635820084
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.026090162504279053,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.026090162504279053
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6752411575562701,
"acc_stderr": 0.026596782287697043,
"acc_norm": 0.6752411575562701,
"acc_norm_stderr": 0.026596782287697043
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6882716049382716,
"acc_stderr": 0.02577311116963045,
"acc_norm": 0.6882716049382716,
"acc_norm_stderr": 0.02577311116963045
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.029766675075873866,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.029766675075873866
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44198174706649285,
"acc_stderr": 0.012683972513598816,
"acc_norm": 0.44198174706649285,
"acc_norm_stderr": 0.012683972513598816
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6433823529411765,
"acc_stderr": 0.029097209568411952,
"acc_norm": 0.6433823529411765,
"acc_norm_stderr": 0.029097209568411952
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6503267973856209,
"acc_stderr": 0.019291961895066382,
"acc_norm": 0.6503267973856209,
"acc_norm_stderr": 0.019291961895066382
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7061224489795919,
"acc_stderr": 0.02916273841024977,
"acc_norm": 0.7061224489795919,
"acc_norm_stderr": 0.02916273841024977
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.582089552238806,
"acc_stderr": 0.034875586404620636,
"acc_norm": 0.582089552238806,
"acc_norm_stderr": 0.034875586404620636
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5862913096695227,
"mc1_stderr": 0.0172408618120998,
"mc2": 0.729182943288448,
"mc2_stderr": 0.01463892425987301
},
"harness|winogrande|5": {
"acc": 0.8129439621152328,
"acc_stderr": 0.010959716435242912
},
"harness|gsm8k|5": {
"acc": 0.5686125852918877,
"acc_stderr": 0.013642195352511563
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
iamkaikai/FluentUI-ART | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 8219353.0
num_examples: 391
download_size: 7746230
dataset_size: 8219353.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "FluentUI-ART"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
natnitaract/kaggel-llm-science-exam-2023-RAG | ---
license: apache-2.0
task_categories:
- multiple-choice
--- |
lucifertrj/AnimeQuotes | ---
license: apache-2.0
---
|
tinyBenchmarks/tinyGSM8k | ---
dataset_info:
config_name: main
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: input_formatted
dtype: string
splits:
- name: train
num_bytes: 27470490
num_examples: 7473
- name: test
num_bytes: 357642
num_examples: 100
download_size: 5523427
dataset_size: 27828132
configs:
- config_name: main
data_files:
- split: train
path: main/train-*
- split: test
path: main/test-*
annotations_creators:
- crowdsourced
language_creators:
- crowdsourced
language:
- en
multilinguality:
- monolingual
size_categories:
- n<1K
source_datasets:
- gsm8k
task_categories:
- text2text-generation
task_ids: []
pretty_name: tinyGSM8k
tags:
- math-word-problems
---
# tinyGSM8K
Welcome to tinyGSM8K! This dataset serves as a concise version of the [GSM8K](https://huggingface.co/datasets/gsm8k) dataset, offering a subset of 100 data points selected from the original compilation.
tinyGSM8K is designed to enable users to efficiently estimate the performance of a large language model (LLM) with reduced dataset size, saving computational resources
while maintaining the essence of the GSM8K evaluation.
## Features
- **Compact Dataset:** With only 100 data points, tinyGSM8K provides a swift and efficient way to evaluate your LLM against a benchmark set that preserves the character of the original GSM8K dataset.
- **Compatibility:** tinyGSM8K is compatible with evaluation using the [lm evaluation harness](https://github.com/EleutherAI/lm-evaluation-harness/), but can also be integrated into your custom pipeline. See below for more details.
## Model Evaluation
Users looking to evaluate a new model with tinyGSM8K can use the [lm evaluation harness (v0.4.1 or later)](https://github.com/EleutherAI/lm-evaluation-harness/).
Simply replace `dataset_path: gsm8k` with `dataset_path: tinyBenchmarks/tinyGSM8K` in the file `lm-evaluation-harness/lm_eval/tasks/gsm8k/gsm8k.yaml`
and run your evaluation harness as usual, using the `--log_samples` argument:
```shell
lm_eval --model hf --model_args pretrained="<your-model>" --tasks=gsm8k --batch_size=1 --num_fewshot=5 --output_path=<output_path> --log_samples
```
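For reference, after that edit the dataset line in `lm-evaluation-harness/lm_eval/tasks/gsm8k/gsm8k.yaml` reads:

```yaml
dataset_path: tinyBenchmarks/tinyGSM8K
```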
Alternatively, tinyGSM8K can be integrated into any other pipeline by downloading the data via
```python
from datasets import load_dataset
tiny_data = load_dataset('tinyBenchmarks/tinyGSM8K', 'main')['test']
```
Now, `tiny_data` contains the 100 subsampled data points with the same features as the original dataset, as well as an additional field containing the preformatted data points.
The preformatted data points follow the formatting used in the [open llm leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard) including the respective in-context examples.
Please be aware that evaluating on multiple GPUs can change the order of outputs in the lm evaluation harness.
If that happens, you will need to restore the original tinyGSM8K order of your score vector before using the tinyBenchmarks library.
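A minimal sketch of such a reordering, assuming you have recorded each sample's original dataset index (here called `doc_id`) alongside its score — the names below are illustrative, not part of the harness API:

```python
import numpy as np

def reorder_scores(doc_ids, scores):
    """Sort a permuted score vector back into the original dataset order.

    doc_ids : original tinyGSM8K example indices, in the (possibly shuffled)
              order the harness emitted them
    scores  : per-example scores in that same emitted order
    """
    doc_ids = np.asarray(doc_ids)
    scores = np.asarray(scores, dtype=float)
    # argsort gives, for each original index, its position in the emitted order
    return scores[np.argsort(doc_ids)]

# Scores logged out of order (e.g. from a multi-GPU run):
y = reorder_scores(doc_ids=[2, 0, 1], scores=[0.0, 1.0, 1.0])
# y is now ordered by original index: [1.0, 1.0, 0.0]
```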
When using the lm evaluation harness, you can then estimate your LLM's performance using the following code. First, ensure you have the tinyBenchmarks package installed:
```shell
pip install git+https://github.com/felipemaiapolo/tinyBenchmarks
```
Then, use the code snippet below for the evaluation:
```python
import numpy as np
import tinyBenchmarks as tb
### Score vector
y = ...  # your original score vector (one score per tinyGSM8K example)
### Parameters
benchmark = 'gsm8k'
### Evaluation
tb.evaluate(y, benchmark)
```
This process will help you estimate the performance of your LLM against the tinyGSM8K dataset, providing a streamlined approach to benchmarking.
For more detailed instructions on evaluating new models and computing scores, please refer to the comprehensive guides available at [lm evaluation harness](https://github.com/EleutherAI/lm-evaluation-harness/) and [tinyBenchmarks GitHub](https://github.com/felipemaiapolo/tinyBenchmarks).
Happy benchmarking!
## More tinyBenchmarks
**Open LLM leaderboard**:
[tiny MMLU](https://huggingface.co/datasets/tinyBenchmarks/tinyMMLU),
[tiny Arc-Challenge](https://huggingface.co/datasets/tinyBenchmarks/tinyAI2_arc),
[tiny Winogrande](https://huggingface.co/datasets/tinyBenchmarks/tinyWinogrande),
[tiny Hellaswag](https://huggingface.co/datasets/tinyBenchmarks/tinyHellaswag),
[tiny TruthfulQA](https://huggingface.co/datasets/tinyBenchmarks/tinyTruthfulQA),
**AlpacaEval**:
[tiny AlpacaEval](https://huggingface.co/datasets/tinyBenchmarks/tinyAlpacaEval)
**HELM-lite**:
_work-in-progress_
## Citation
@article{polo2024tinybenchmarks,
title={tinyBenchmarks: evaluating LLMs with fewer examples},
author={Felipe Maia Polo and Lucas Weber and Leshem Choshen and Yuekai Sun and Gongjun Xu and Mikhail Yurochkin},
year={2024},
eprint={2402.14992},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
@article{cobbe2021gsm8k,
title={Training Verifiers to Solve Math Word Problems},
author={Cobbe, Karl and Kosaraju, Vineet and Bavarian, Mohammad and Chen, Mark and Jun, Heewoo and Kaiser, Lukasz and Plappert, Matthias and Tworek, Jerry and Hilton, Jacob and Nakano, Reiichiro and Hesse, Christopher and Schulman, John},
journal={arXiv preprint arXiv:2110.14168},
year={2021}
} |
liuyanchen1015/MULTI_VALUE_cola_linking_relcl | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 6973
num_examples: 67
- name: test
num_bytes: 5839
num_examples: 56
- name: train
num_bytes: 10542
num_examples: 109
download_size: 17172
dataset_size: 23354
---
# Dataset Card for "MULTI_VALUE_cola_linking_relcl"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
delphi-suite/v0-next-logprobs-llama2-6.4m | ---
dataset_info:
features:
- name: logprobs
sequence: float64
splits:
- name: validation
num_bytes: 45818277
num_examples: 10982
download_size: 37790807
dataset_size: 45818277
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
|
open-llm-leaderboard/details_revolutionarybukhari__Llama-2-7b-chat-finetune-AUTOMATE | ---
pretty_name: Evaluation run of revolutionarybukhari/Llama-2-7b-chat-finetune-AUTOMATE
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [revolutionarybukhari/Llama-2-7b-chat-finetune-AUTOMATE](https://huggingface.co/revolutionarybukhari/Llama-2-7b-chat-finetune-AUTOMATE)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_revolutionarybukhari__Llama-2-7b-chat-finetune-AUTOMATE_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-12T17:51:35.598056](https://huggingface.co/datasets/open-llm-leaderboard/details_revolutionarybukhari__Llama-2-7b-chat-finetune-AUTOMATE_public/blob/main/results_2023-11-12T17-51-35.598056.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.48664997345720373,\n\
\ \"acc_stderr\": 0.03427289794847252,\n \"acc_norm\": 0.4932299888431757,\n\
\ \"acc_norm_stderr\": 0.03508094254293674,\n \"mc1\": 0.2876376988984088,\n\
\ \"mc1_stderr\": 0.015846315101394812,\n \"mc2\": 0.44729919889234016,\n\
\ \"mc2_stderr\": 0.015286276115878357,\n \"em\": 0.010906040268456376,\n\
\ \"em_stderr\": 0.0010636334198498001,\n \"f1\": 0.06768770973154396,\n\
\ \"f1_stderr\": 0.0017077194500790263\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4906143344709898,\n \"acc_stderr\": 0.014608816322065,\n\
\ \"acc_norm\": 0.5307167235494881,\n \"acc_norm_stderr\": 0.014583792546304037\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5622385978888668,\n\
\ \"acc_stderr\": 0.004950973231188739,\n \"acc_norm\": 0.7559251145190201,\n\
\ \"acc_norm_stderr\": 0.004286594977390899\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.42962962962962964,\n\
\ \"acc_stderr\": 0.04276349494376599,\n \"acc_norm\": 0.42962962962962964,\n\
\ \"acc_norm_stderr\": 0.04276349494376599\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.46710526315789475,\n \"acc_stderr\": 0.040601270352363966,\n\
\ \"acc_norm\": 0.46710526315789475,\n \"acc_norm_stderr\": 0.040601270352363966\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n\
\ \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.539622641509434,\n \"acc_stderr\": 0.030676096599389184,\n\
\ \"acc_norm\": 0.539622641509434,\n \"acc_norm_stderr\": 0.030676096599389184\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5347222222222222,\n\
\ \"acc_stderr\": 0.04171115858181618,\n \"acc_norm\": 0.5347222222222222,\n\
\ \"acc_norm_stderr\": 0.04171115858181618\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n\
\ \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4046242774566474,\n\
\ \"acc_stderr\": 0.03742461193887248,\n \"acc_norm\": 0.4046242774566474,\n\
\ \"acc_norm_stderr\": 0.03742461193887248\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n\
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.42127659574468085,\n \"acc_stderr\": 0.03227834510146268,\n\
\ \"acc_norm\": 0.42127659574468085,\n \"acc_norm_stderr\": 0.03227834510146268\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.37719298245614036,\n\
\ \"acc_stderr\": 0.045595221419582166,\n \"acc_norm\": 0.37719298245614036,\n\
\ \"acc_norm_stderr\": 0.045595221419582166\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.503448275862069,\n \"acc_stderr\": 0.041665675771015785,\n\
\ \"acc_norm\": 0.503448275862069,\n \"acc_norm_stderr\": 0.041665675771015785\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2962962962962963,\n \"acc_stderr\": 0.023517294335963286,\n \"\
acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.023517294335963286\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.24603174603174602,\n\
\ \"acc_stderr\": 0.03852273364924314,\n \"acc_norm\": 0.24603174603174602,\n\
\ \"acc_norm_stderr\": 0.03852273364924314\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.532258064516129,\n \"acc_stderr\": 0.028384747788813332,\n \"\
acc_norm\": 0.532258064516129,\n \"acc_norm_stderr\": 0.028384747788813332\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.35960591133004927,\n \"acc_stderr\": 0.03376458246509566,\n \"\
acc_norm\": 0.35960591133004927,\n \"acc_norm_stderr\": 0.03376458246509566\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.5696969696969697,\n \"acc_stderr\": 0.03866225962879077,\n\
\ \"acc_norm\": 0.5696969696969697,\n \"acc_norm_stderr\": 0.03866225962879077\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5909090909090909,\n \"acc_stderr\": 0.03502975799413007,\n \"\
acc_norm\": 0.5909090909090909,\n \"acc_norm_stderr\": 0.03502975799413007\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7098445595854922,\n \"acc_stderr\": 0.03275264467791516,\n\
\ \"acc_norm\": 0.7098445595854922,\n \"acc_norm_stderr\": 0.03275264467791516\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4230769230769231,\n \"acc_stderr\": 0.02504919787604234,\n \
\ \"acc_norm\": 0.4230769230769231,\n \"acc_norm_stderr\": 0.02504919787604234\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26666666666666666,\n \"acc_stderr\": 0.02696242432507383,\n \
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.02696242432507383\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.4369747899159664,\n \"acc_stderr\": 0.03221943636566196,\n \
\ \"acc_norm\": 0.4369747899159664,\n \"acc_norm_stderr\": 0.03221943636566196\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.304635761589404,\n \"acc_stderr\": 0.03757949922943343,\n \"acc_norm\"\
: 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943343\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.671559633027523,\n\
\ \"acc_stderr\": 0.02013590279729841,\n \"acc_norm\": 0.671559633027523,\n\
\ \"acc_norm_stderr\": 0.02013590279729841\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.3287037037037037,\n \"acc_stderr\": 0.032036140846700596,\n\
\ \"acc_norm\": 0.3287037037037037,\n \"acc_norm_stderr\": 0.032036140846700596\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6813725490196079,\n \"acc_stderr\": 0.032702871814820796,\n \"\
acc_norm\": 0.6813725490196079,\n \"acc_norm_stderr\": 0.032702871814820796\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6708860759493671,\n \"acc_stderr\": 0.03058732629470237,\n \
\ \"acc_norm\": 0.6708860759493671,\n \"acc_norm_stderr\": 0.03058732629470237\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5874439461883408,\n\
\ \"acc_stderr\": 0.03304062175449297,\n \"acc_norm\": 0.5874439461883408,\n\
\ \"acc_norm_stderr\": 0.03304062175449297\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5877862595419847,\n \"acc_stderr\": 0.04317171194870255,\n\
\ \"acc_norm\": 0.5877862595419847,\n \"acc_norm_stderr\": 0.04317171194870255\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6446280991735537,\n \"acc_stderr\": 0.0436923632657398,\n \"acc_norm\"\
: 0.6446280991735537,\n \"acc_norm_stderr\": 0.0436923632657398\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6111111111111112,\n\
\ \"acc_stderr\": 0.04712821257426769,\n \"acc_norm\": 0.6111111111111112,\n\
\ \"acc_norm_stderr\": 0.04712821257426769\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5521472392638037,\n \"acc_stderr\": 0.03906947479456606,\n\
\ \"acc_norm\": 0.5521472392638037,\n \"acc_norm_stderr\": 0.03906947479456606\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3482142857142857,\n\
\ \"acc_stderr\": 0.04521829902833586,\n \"acc_norm\": 0.3482142857142857,\n\
\ \"acc_norm_stderr\": 0.04521829902833586\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6893203883495146,\n \"acc_stderr\": 0.045821241601615506,\n\
\ \"acc_norm\": 0.6893203883495146,\n \"acc_norm_stderr\": 0.045821241601615506\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.02934311479809446,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.02934311479809446\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6704980842911877,\n\
\ \"acc_stderr\": 0.01680832226174046,\n \"acc_norm\": 0.6704980842911877,\n\
\ \"acc_norm_stderr\": 0.01680832226174046\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5202312138728323,\n \"acc_stderr\": 0.026897049996382875,\n\
\ \"acc_norm\": 0.5202312138728323,\n \"acc_norm_stderr\": 0.026897049996382875\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2223463687150838,\n\
\ \"acc_stderr\": 0.013907189208156881,\n \"acc_norm\": 0.2223463687150838,\n\
\ \"acc_norm_stderr\": 0.013907189208156881\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5098039215686274,\n \"acc_stderr\": 0.028624412550167958,\n\
\ \"acc_norm\": 0.5098039215686274,\n \"acc_norm_stderr\": 0.028624412550167958\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.572347266881029,\n\
\ \"acc_stderr\": 0.02809924077580956,\n \"acc_norm\": 0.572347266881029,\n\
\ \"acc_norm_stderr\": 0.02809924077580956\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5740740740740741,\n \"acc_stderr\": 0.027513747284379428,\n\
\ \"acc_norm\": 0.5740740740740741,\n \"acc_norm_stderr\": 0.027513747284379428\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3723404255319149,\n \"acc_stderr\": 0.02883892147125146,\n \
\ \"acc_norm\": 0.3723404255319149,\n \"acc_norm_stderr\": 0.02883892147125146\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.34615384615384615,\n\
\ \"acc_stderr\": 0.012150699768228556,\n \"acc_norm\": 0.34615384615384615,\n\
\ \"acc_norm_stderr\": 0.012150699768228556\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4742647058823529,\n \"acc_stderr\": 0.03033257809455504,\n\
\ \"acc_norm\": 0.4742647058823529,\n \"acc_norm_stderr\": 0.03033257809455504\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4869281045751634,\n \"acc_stderr\": 0.020220920829626916,\n \
\ \"acc_norm\": 0.4869281045751634,\n \"acc_norm_stderr\": 0.020220920829626916\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5272727272727272,\n\
\ \"acc_stderr\": 0.04782001791380061,\n \"acc_norm\": 0.5272727272727272,\n\
\ \"acc_norm_stderr\": 0.04782001791380061\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5306122448979592,\n \"acc_stderr\": 0.031949171367580624,\n\
\ \"acc_norm\": 0.5306122448979592,\n \"acc_norm_stderr\": 0.031949171367580624\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6517412935323383,\n\
\ \"acc_stderr\": 0.033687874661154596,\n \"acc_norm\": 0.6517412935323383,\n\
\ \"acc_norm_stderr\": 0.033687874661154596\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.41566265060240964,\n\
\ \"acc_stderr\": 0.03836722176598052,\n \"acc_norm\": 0.41566265060240964,\n\
\ \"acc_norm_stderr\": 0.03836722176598052\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7251461988304093,\n \"acc_stderr\": 0.034240429246915824,\n\
\ \"acc_norm\": 0.7251461988304093,\n \"acc_norm_stderr\": 0.034240429246915824\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2876376988984088,\n\
\ \"mc1_stderr\": 0.015846315101394812,\n \"mc2\": 0.44729919889234016,\n\
\ \"mc2_stderr\": 0.015286276115878357\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7324388318863457,\n \"acc_stderr\": 0.01244171845689301\n\
\ },\n \"harness|drop|3\": {\n \"em\": 0.010906040268456376,\n \
\ \"em_stderr\": 0.0010636334198498001,\n \"f1\": 0.06768770973154396,\n\
\ \"f1_stderr\": 0.0017077194500790263\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.08642911296436695,\n \"acc_stderr\": 0.007740044337103787\n\
\ }\n}\n```"
repo_url: https://huggingface.co/revolutionarybukhari/Llama-2-7b-chat-finetune-AUTOMATE
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_11_12T17_51_35.598056
path:
- '**/details_harness|arc:challenge|25_2023-11-12T17-51-35.598056.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-11-12T17-51-35.598056.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_11_12T17_51_35.598056
path:
- '**/details_harness|drop|3_2023-11-12T17-51-35.598056.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-12T17-51-35.598056.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_12T17_51_35.598056
path:
- '**/details_harness|gsm8k|5_2023-11-12T17-51-35.598056.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-12T17-51-35.598056.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_11_12T17_51_35.598056
path:
- '**/details_harness|hellaswag|10_2023-11-12T17-51-35.598056.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-11-12T17-51-35.598056.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_11_12T17_51_35.598056
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-12T17-51-35.598056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-12T17-51-35.598056.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-12T17-51-35.598056.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_11_12T17_51_35.598056
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-12T17-51-35.598056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-12T17-51-35.598056.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_11_12T17_51_35.598056
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-12T17-51-35.598056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-12T17-51-35.598056.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_11_12T17_51_35.598056
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-12T17-51-35.598056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-12T17-51-35.598056.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_11_12T17_51_35.598056
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-12T17-51-35.598056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-12T17-51-35.598056.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_11_12T17_51_35.598056
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-12T17-51-35.598056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-12T17-51-35.598056.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_11_12T17_51_35.598056
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-12T17-51-35.598056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-12T17-51-35.598056.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_11_12T17_51_35.598056
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-12T17-51-35.598056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-12T17-51-35.598056.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_11_12T17_51_35.598056
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-12T17-51-35.598056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-12T17-51-35.598056.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_11_12T17_51_35.598056
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-12T17-51-35.598056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-12T17-51-35.598056.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_11_12T17_51_35.598056
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-12T17-51-35.598056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-12T17-51-35.598056.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_11_12T17_51_35.598056
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-12T17-51-35.598056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-12T17-51-35.598056.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_11_12T17_51_35.598056
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-12T17-51-35.598056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-12T17-51-35.598056.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_11_12T17_51_35.598056
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-12T17-51-35.598056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-12T17-51-35.598056.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_11_12T17_51_35.598056
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-12T17-51-35.598056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-12T17-51-35.598056.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_11_12T17_51_35.598056
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-12T17-51-35.598056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-12T17-51-35.598056.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_11_12T17_51_35.598056
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-12T17-51-35.598056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-12T17-51-35.598056.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_11_12T17_51_35.598056
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-12T17-51-35.598056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-12T17-51-35.598056.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_11_12T17_51_35.598056
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-12T17-51-35.598056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-12T17-51-35.598056.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_11_12T17_51_35.598056
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-12T17-51-35.598056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-12T17-51-35.598056.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_11_12T17_51_35.598056
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-12T17-51-35.598056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-12T17-51-35.598056.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_11_12T17_51_35.598056
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-12T17-51-35.598056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-12T17-51-35.598056.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_11_12T17_51_35.598056
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-12T17-51-35.598056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-12T17-51-35.598056.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_11_12T17_51_35.598056
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-12T17-51-35.598056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-12T17-51-35.598056.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_11_12T17_51_35.598056
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-12T17-51-35.598056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-12T17-51-35.598056.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_11_12T17_51_35.598056
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-12T17-51-35.598056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-12T17-51-35.598056.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_11_12T17_51_35.598056
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-12T17-51-35.598056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-12T17-51-35.598056.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_11_12T17_51_35.598056
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-12T17-51-35.598056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-12T17-51-35.598056.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_11_12T17_51_35.598056
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-12T17-51-35.598056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-12T17-51-35.598056.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_11_12T17_51_35.598056
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-12T17-51-35.598056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-12T17-51-35.598056.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_11_12T17_51_35.598056
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-12T17-51-35.598056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-12T17-51-35.598056.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_11_12T17_51_35.598056
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-12T17-51-35.598056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-12T17-51-35.598056.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_11_12T17_51_35.598056
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-12T17-51-35.598056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-12T17-51-35.598056.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_11_12T17_51_35.598056
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-12T17-51-35.598056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-12T17-51-35.598056.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_11_12T17_51_35.598056
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-12T17-51-35.598056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-12T17-51-35.598056.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_11_12T17_51_35.598056
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-12T17-51-35.598056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-12T17-51-35.598056.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_11_12T17_51_35.598056
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-12T17-51-35.598056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-12T17-51-35.598056.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_11_12T17_51_35.598056
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-12T17-51-35.598056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-12T17-51-35.598056.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_11_12T17_51_35.598056
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-12T17-51-35.598056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-12T17-51-35.598056.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_11_12T17_51_35.598056
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-12T17-51-35.598056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-12T17-51-35.598056.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_11_12T17_51_35.598056
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-12T17-51-35.598056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-12T17-51-35.598056.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_11_12T17_51_35.598056
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-12T17-51-35.598056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-12T17-51-35.598056.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_11_12T17_51_35.598056
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-12T17-51-35.598056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-12T17-51-35.598056.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_11_12T17_51_35.598056
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-12T17-51-35.598056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-12T17-51-35.598056.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_11_12T17_51_35.598056
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-12T17-51-35.598056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-12T17-51-35.598056.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_11_12T17_51_35.598056
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-12T17-51-35.598056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-12T17-51-35.598056.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_11_12T17_51_35.598056
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-12T17-51-35.598056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-12T17-51-35.598056.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_11_12T17_51_35.598056
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-12T17-51-35.598056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-12T17-51-35.598056.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_11_12T17_51_35.598056
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-12T17-51-35.598056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-12T17-51-35.598056.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_11_12T17_51_35.598056
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-12T17-51-35.598056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-12T17-51-35.598056.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_11_12T17_51_35.598056
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-12T17-51-35.598056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-12T17-51-35.598056.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_11_12T17_51_35.598056
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-12T17-51-35.598056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-12T17-51-35.598056.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_11_12T17_51_35.598056
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-12T17-51-35.598056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-12T17-51-35.598056.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_11_12T17_51_35.598056
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-12T17-51-35.598056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-12T17-51-35.598056.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_11_12T17_51_35.598056
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-12T17-51-35.598056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-12T17-51-35.598056.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_11_12T17_51_35.598056
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-12T17-51-35.598056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-12T17-51-35.598056.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_11_12T17_51_35.598056
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-12T17-51-35.598056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-12T17-51-35.598056.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_11_12T17_51_35.598056
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-12T17-51-35.598056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-12T17-51-35.598056.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_11_12T17_51_35.598056
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-12T17-51-35.598056.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-12T17-51-35.598056.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_12T17_51_35.598056
path:
- '**/details_harness|winogrande|5_2023-11-12T17-51-35.598056.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-12T17-51-35.598056.parquet'
- config_name: results
data_files:
- split: 2023_11_12T17_51_35.598056
path:
- results_2023-11-12T17-51-35.598056.parquet
- split: latest
path:
- results_2023-11-12T17-51-35.598056.parquet
---
# Dataset Card for Evaluation run of revolutionarybukhari/Llama-2-7b-chat-finetune-AUTOMATE
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/revolutionarybukhari/Llama-2-7b-chat-finetune-AUTOMATE
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [revolutionarybukhari/Llama-2-7b-chat-finetune-AUTOMATE](https://huggingface.co/revolutionarybukhari/Llama-2-7b-chat-finetune-AUTOMATE) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_revolutionarybukhari__Llama-2-7b-chat-finetune-AUTOMATE_public",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-11-12T17:51:35.598056](https://huggingface.co/datasets/open-llm-leaderboard/details_revolutionarybukhari__Llama-2-7b-chat-finetune-AUTOMATE_public/blob/main/results_2023-11-12T17-51-35.598056.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.48664997345720373,
"acc_stderr": 0.03427289794847252,
"acc_norm": 0.4932299888431757,
"acc_norm_stderr": 0.03508094254293674,
"mc1": 0.2876376988984088,
"mc1_stderr": 0.015846315101394812,
"mc2": 0.44729919889234016,
"mc2_stderr": 0.015286276115878357,
"em": 0.010906040268456376,
"em_stderr": 0.0010636334198498001,
"f1": 0.06768770973154396,
"f1_stderr": 0.0017077194500790263
},
"harness|arc:challenge|25": {
"acc": 0.4906143344709898,
"acc_stderr": 0.014608816322065,
"acc_norm": 0.5307167235494881,
"acc_norm_stderr": 0.014583792546304037
},
"harness|hellaswag|10": {
"acc": 0.5622385978888668,
"acc_stderr": 0.004950973231188739,
"acc_norm": 0.7559251145190201,
"acc_norm_stderr": 0.004286594977390899
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.42962962962962964,
"acc_stderr": 0.04276349494376599,
"acc_norm": 0.42962962962962964,
"acc_norm_stderr": 0.04276349494376599
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.46710526315789475,
"acc_stderr": 0.040601270352363966,
"acc_norm": 0.46710526315789475,
"acc_norm_stderr": 0.040601270352363966
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.539622641509434,
"acc_stderr": 0.030676096599389184,
"acc_norm": 0.539622641509434,
"acc_norm_stderr": 0.030676096599389184
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5347222222222222,
"acc_stderr": 0.04171115858181618,
"acc_norm": 0.5347222222222222,
"acc_norm_stderr": 0.04171115858181618
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4046242774566474,
"acc_stderr": 0.03742461193887248,
"acc_norm": 0.4046242774566474,
"acc_norm_stderr": 0.03742461193887248
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.041583075330832865,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.041583075330832865
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.42127659574468085,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.42127659574468085,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.37719298245614036,
"acc_stderr": 0.045595221419582166,
"acc_norm": 0.37719298245614036,
"acc_norm_stderr": 0.045595221419582166
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.503448275862069,
"acc_stderr": 0.041665675771015785,
"acc_norm": 0.503448275862069,
"acc_norm_stderr": 0.041665675771015785
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.023517294335963286,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.023517294335963286
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.24603174603174602,
"acc_stderr": 0.03852273364924314,
"acc_norm": 0.24603174603174602,
"acc_norm_stderr": 0.03852273364924314
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.532258064516129,
"acc_stderr": 0.028384747788813332,
"acc_norm": 0.532258064516129,
"acc_norm_stderr": 0.028384747788813332
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.35960591133004927,
"acc_stderr": 0.03376458246509566,
"acc_norm": 0.35960591133004927,
"acc_norm_stderr": 0.03376458246509566
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5696969696969697,
"acc_stderr": 0.03866225962879077,
"acc_norm": 0.5696969696969697,
"acc_norm_stderr": 0.03866225962879077
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5909090909090909,
"acc_stderr": 0.03502975799413007,
"acc_norm": 0.5909090909090909,
"acc_norm_stderr": 0.03502975799413007
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7098445595854922,
"acc_stderr": 0.03275264467791516,
"acc_norm": 0.7098445595854922,
"acc_norm_stderr": 0.03275264467791516
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4230769230769231,
"acc_stderr": 0.02504919787604234,
"acc_norm": 0.4230769230769231,
"acc_norm_stderr": 0.02504919787604234
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.02696242432507383,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.02696242432507383
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4369747899159664,
"acc_stderr": 0.03221943636566196,
"acc_norm": 0.4369747899159664,
"acc_norm_stderr": 0.03221943636566196
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943343,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943343
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.671559633027523,
"acc_stderr": 0.02013590279729841,
"acc_norm": 0.671559633027523,
"acc_norm_stderr": 0.02013590279729841
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3287037037037037,
"acc_stderr": 0.032036140846700596,
"acc_norm": 0.3287037037037037,
"acc_norm_stderr": 0.032036140846700596
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6813725490196079,
"acc_stderr": 0.032702871814820796,
"acc_norm": 0.6813725490196079,
"acc_norm_stderr": 0.032702871814820796
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6708860759493671,
"acc_stderr": 0.03058732629470237,
"acc_norm": 0.6708860759493671,
"acc_norm_stderr": 0.03058732629470237
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5874439461883408,
"acc_stderr": 0.03304062175449297,
"acc_norm": 0.5874439461883408,
"acc_norm_stderr": 0.03304062175449297
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5877862595419847,
"acc_stderr": 0.04317171194870255,
"acc_norm": 0.5877862595419847,
"acc_norm_stderr": 0.04317171194870255
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6446280991735537,
"acc_stderr": 0.0436923632657398,
"acc_norm": 0.6446280991735537,
"acc_norm_stderr": 0.0436923632657398
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.04712821257426769,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.04712821257426769
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5521472392638037,
"acc_stderr": 0.03906947479456606,
"acc_norm": 0.5521472392638037,
"acc_norm_stderr": 0.03906947479456606
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3482142857142857,
"acc_stderr": 0.04521829902833586,
"acc_norm": 0.3482142857142857,
"acc_norm_stderr": 0.04521829902833586
},
"harness|hendrycksTest-management|5": {
"acc": 0.6893203883495146,
"acc_stderr": 0.045821241601615506,
"acc_norm": 0.6893203883495146,
"acc_norm_stderr": 0.045821241601615506
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.02934311479809446,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.02934311479809446
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6704980842911877,
"acc_stderr": 0.01680832226174046,
"acc_norm": 0.6704980842911877,
"acc_norm_stderr": 0.01680832226174046
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5202312138728323,
"acc_stderr": 0.026897049996382875,
"acc_norm": 0.5202312138728323,
"acc_norm_stderr": 0.026897049996382875
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2223463687150838,
"acc_stderr": 0.013907189208156881,
"acc_norm": 0.2223463687150838,
"acc_norm_stderr": 0.013907189208156881
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5098039215686274,
"acc_stderr": 0.028624412550167958,
"acc_norm": 0.5098039215686274,
"acc_norm_stderr": 0.028624412550167958
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.572347266881029,
"acc_stderr": 0.02809924077580956,
"acc_norm": 0.572347266881029,
"acc_norm_stderr": 0.02809924077580956
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5740740740740741,
"acc_stderr": 0.027513747284379428,
"acc_norm": 0.5740740740740741,
"acc_norm_stderr": 0.027513747284379428
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3723404255319149,
"acc_stderr": 0.02883892147125146,
"acc_norm": 0.3723404255319149,
"acc_norm_stderr": 0.02883892147125146
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.34615384615384615,
"acc_stderr": 0.012150699768228556,
"acc_norm": 0.34615384615384615,
"acc_norm_stderr": 0.012150699768228556
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4742647058823529,
"acc_stderr": 0.03033257809455504,
"acc_norm": 0.4742647058823529,
"acc_norm_stderr": 0.03033257809455504
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4869281045751634,
"acc_stderr": 0.020220920829626916,
"acc_norm": 0.4869281045751634,
"acc_norm_stderr": 0.020220920829626916
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5272727272727272,
"acc_stderr": 0.04782001791380061,
"acc_norm": 0.5272727272727272,
"acc_norm_stderr": 0.04782001791380061
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5306122448979592,
"acc_stderr": 0.031949171367580624,
"acc_norm": 0.5306122448979592,
"acc_norm_stderr": 0.031949171367580624
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6517412935323383,
"acc_stderr": 0.033687874661154596,
"acc_norm": 0.6517412935323383,
"acc_norm_stderr": 0.033687874661154596
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-virology|5": {
"acc": 0.41566265060240964,
"acc_stderr": 0.03836722176598052,
"acc_norm": 0.41566265060240964,
"acc_norm_stderr": 0.03836722176598052
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7251461988304093,
"acc_stderr": 0.034240429246915824,
"acc_norm": 0.7251461988304093,
"acc_norm_stderr": 0.034240429246915824
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2876376988984088,
"mc1_stderr": 0.015846315101394812,
"mc2": 0.44729919889234016,
"mc2_stderr": 0.015286276115878357
},
"harness|winogrande|5": {
"acc": 0.7324388318863457,
"acc_stderr": 0.01244171845689301
},
"harness|drop|3": {
"em": 0.010906040268456376,
"em_stderr": 0.0010636334198498001,
"f1": 0.06768770973154396,
"f1_stderr": 0.0017077194500790263
},
"harness|gsm8k|5": {
"acc": 0.08642911296436695,
"acc_stderr": 0.007740044337103787
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
betteracs/thai-receipt-ocr-v2 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: image
dtype: image
- name: label
dtype: string
splits:
- name: train
num_bytes: 59665165.0
num_examples: 933
download_size: 59661693
dataset_size: 59665165.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_Changgil__K2S3-SOLAR-11b-v4.0 | ---
pretty_name: Evaluation run of Changgil/K2S3-SOLAR-11b-v4.0
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Changgil/K2S3-SOLAR-11b-v4.0](https://huggingface.co/Changgil/K2S3-SOLAR-11b-v4.0)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Changgil__K2S3-SOLAR-11b-v4.0\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-21T14:01:21.844121](https://huggingface.co/datasets/open-llm-leaderboard/details_Changgil__K2S3-SOLAR-11b-v4.0/blob/main/results_2024-03-21T14-01-21.844121.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6425151664260456,\n\
\ \"acc_stderr\": 0.031453087920603535,\n \"acc_norm\": 0.6544152950347083,\n\
\ \"acc_norm_stderr\": 0.03231338017710922,\n \"mc1\": 0.3623011015911873,\n\
\ \"mc1_stderr\": 0.016826646897262258,\n \"mc2\": 0.5163245208016025,\n\
\ \"mc2_stderr\": 0.015033506821956658\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.590443686006826,\n \"acc_stderr\": 0.014370358632472442,\n\
\ \"acc_norm\": 0.636518771331058,\n \"acc_norm_stderr\": 0.014056207319068285\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6509659430392352,\n\
\ \"acc_stderr\": 0.004756905819649977,\n \"acc_norm\": 0.847540330611432,\n\
\ \"acc_norm_stderr\": 0.003587312328180707\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621503,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621503\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5703703703703704,\n\
\ \"acc_stderr\": 0.042763494943765995,\n \"acc_norm\": 0.5703703703703704,\n\
\ \"acc_norm_stderr\": 0.042763494943765995\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.03523807393012047,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.03523807393012047\n \
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.7,\n\
\ \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.028637235639800893,\n\
\ \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.028637235639800893\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n\
\ \"acc_stderr\": 0.035868792800803406,\n \"acc_norm\": 0.7569444444444444,\n\
\ \"acc_norm_stderr\": 0.035868792800803406\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n\
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n\
\ \"acc_stderr\": 0.035839017547364134,\n \"acc_norm\": 0.6705202312138728,\n\
\ \"acc_norm_stderr\": 0.035839017547364134\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.04533838195929776,\n\
\ \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.04533838195929776\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5829787234042553,\n \"acc_stderr\": 0.03223276266711712,\n\
\ \"acc_norm\": 0.5829787234042553,\n \"acc_norm_stderr\": 0.03223276266711712\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n\
\ \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42063492063492064,\n \"acc_stderr\": 0.025424835086923992,\n \"\
acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.025424835086923992\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n\
\ \"acc_stderr\": 0.04403438954768177,\n \"acc_norm\": 0.4126984126984127,\n\
\ \"acc_norm_stderr\": 0.04403438954768177\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8064516129032258,\n\
\ \"acc_stderr\": 0.022475258525536057,\n \"acc_norm\": 0.8064516129032258,\n\
\ \"acc_norm_stderr\": 0.022475258525536057\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4630541871921182,\n \"acc_stderr\": 0.035083705204426656,\n\
\ \"acc_norm\": 0.4630541871921182,\n \"acc_norm_stderr\": 0.035083705204426656\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.031922715695482995,\n\
\ \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.031922715695482995\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8585858585858586,\n \"acc_stderr\": 0.024825909793343343,\n \"\
acc_norm\": 0.8585858585858586,\n \"acc_norm_stderr\": 0.024825909793343343\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328973,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328973\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6410256410256411,\n \"acc_stderr\": 0.024321738484602354,\n\
\ \"acc_norm\": 0.6410256410256411,\n \"acc_norm_stderr\": 0.024321738484602354\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.362962962962963,\n \"acc_stderr\": 0.029318203645206865,\n \
\ \"acc_norm\": 0.362962962962963,\n \"acc_norm_stderr\": 0.029318203645206865\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.0302839955258844,\n \
\ \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.0302839955258844\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8440366972477065,\n \"acc_stderr\": 0.015555802713590177,\n \"\
acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.015555802713590177\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5925925925925926,\n \"acc_stderr\": 0.033509916046960436,\n \"\
acc_norm\": 0.5925925925925926,\n \"acc_norm_stderr\": 0.033509916046960436\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8431372549019608,\n \"acc_stderr\": 0.025524722324553335,\n \"\
acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.025524722324553335\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8396624472573839,\n \"acc_stderr\": 0.02388438092596567,\n \
\ \"acc_norm\": 0.8396624472573839,\n \"acc_norm_stderr\": 0.02388438092596567\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7219730941704036,\n\
\ \"acc_stderr\": 0.030069584874494036,\n \"acc_norm\": 0.7219730941704036,\n\
\ \"acc_norm_stderr\": 0.030069584874494036\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.035477710041594654,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.035477710041594654\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8181818181818182,\n \"acc_stderr\": 0.035208939510976534,\n \"\
acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.035208939510976534\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.042365112580946315,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.042365112580946315\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.033519538795212696,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.033519538795212696\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.04058042015646034,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.04058042015646034\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.020930193185179337,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.020930193185179337\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816508,\n \
\ \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816508\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8199233716475096,\n\
\ \"acc_stderr\": 0.013740797258579832,\n \"acc_norm\": 0.8199233716475096,\n\
\ \"acc_norm_stderr\": 0.013740797258579832\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069356,\n\
\ \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069356\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2916201117318436,\n\
\ \"acc_stderr\": 0.015201032512520432,\n \"acc_norm\": 0.2916201117318436,\n\
\ \"acc_norm_stderr\": 0.015201032512520432\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7679738562091504,\n \"acc_stderr\": 0.024170840879340863,\n\
\ \"acc_norm\": 0.7679738562091504,\n \"acc_norm_stderr\": 0.024170840879340863\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n\
\ \"acc_stderr\": 0.02600330111788514,\n \"acc_norm\": 0.7009646302250804,\n\
\ \"acc_norm_stderr\": 0.02600330111788514\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7438271604938271,\n \"acc_stderr\": 0.024288533637726095,\n\
\ \"acc_norm\": 0.7438271604938271,\n \"acc_norm_stderr\": 0.024288533637726095\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5070921985815603,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.5070921985815603,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.49478487614080835,\n\
\ \"acc_stderr\": 0.012769541449652547,\n \"acc_norm\": 0.49478487614080835,\n\
\ \"acc_norm_stderr\": 0.012769541449652547\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.026799562024887674,\n\
\ \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.026799562024887674\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6862745098039216,\n \"acc_stderr\": 0.018771683893528176,\n \
\ \"acc_norm\": 0.6862745098039216,\n \"acc_norm_stderr\": 0.018771683893528176\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7551020408163265,\n \"acc_stderr\": 0.027529637440174934,\n\
\ \"acc_norm\": 0.7551020408163265,\n \"acc_norm_stderr\": 0.027529637440174934\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8706467661691543,\n\
\ \"acc_stderr\": 0.02372983088101853,\n \"acc_norm\": 0.8706467661691543,\n\
\ \"acc_norm_stderr\": 0.02372983088101853\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \
\ \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\
\ \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n\
\ \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n\
\ \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3623011015911873,\n\
\ \"mc1_stderr\": 0.016826646897262258,\n \"mc2\": 0.5163245208016025,\n\
\ \"mc2_stderr\": 0.015033506821956658\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8255722178374112,\n \"acc_stderr\": 0.010665187902498442\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/Changgil/K2S3-SOLAR-11b-v4.0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_21T14_01_21.844121
path:
- '**/details_harness|arc:challenge|25_2024-03-21T14-01-21.844121.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-21T14-01-21.844121.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_21T14_01_21.844121
path:
- '**/details_harness|gsm8k|5_2024-03-21T14-01-21.844121.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-21T14-01-21.844121.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_21T14_01_21.844121
path:
- '**/details_harness|hellaswag|10_2024-03-21T14-01-21.844121.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-21T14-01-21.844121.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_21T14_01_21.844121
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T14-01-21.844121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T14-01-21.844121.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T14-01-21.844121.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_21T14_01_21.844121
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T14-01-21.844121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T14-01-21.844121.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_21T14_01_21.844121
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T14-01-21.844121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T14-01-21.844121.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_21T14_01_21.844121
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T14-01-21.844121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T14-01-21.844121.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_21T14_01_21.844121
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T14-01-21.844121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T14-01-21.844121.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_21T14_01_21.844121
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T14-01-21.844121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T14-01-21.844121.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_21T14_01_21.844121
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T14-01-21.844121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T14-01-21.844121.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_21T14_01_21.844121
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T14-01-21.844121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T14-01-21.844121.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_21T14_01_21.844121
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T14-01-21.844121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T14-01-21.844121.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_21T14_01_21.844121
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T14-01-21.844121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T14-01-21.844121.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_21T14_01_21.844121
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T14-01-21.844121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T14-01-21.844121.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_21T14_01_21.844121
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T14-01-21.844121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T14-01-21.844121.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_21T14_01_21.844121
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T14-01-21.844121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T14-01-21.844121.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_21T14_01_21.844121
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T14-01-21.844121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T14-01-21.844121.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_21T14_01_21.844121
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T14-01-21.844121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T14-01-21.844121.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_21T14_01_21.844121
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T14-01-21.844121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T14-01-21.844121.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_21T14_01_21.844121
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T14-01-21.844121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T14-01-21.844121.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_21T14_01_21.844121
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T14-01-21.844121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T14-01-21.844121.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_21T14_01_21.844121
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T14-01-21.844121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T14-01-21.844121.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_21T14_01_21.844121
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T14-01-21.844121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T14-01-21.844121.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_21T14_01_21.844121
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T14-01-21.844121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T14-01-21.844121.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_21T14_01_21.844121
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T14-01-21.844121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T14-01-21.844121.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_21T14_01_21.844121
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T14-01-21.844121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T14-01-21.844121.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_21T14_01_21.844121
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T14-01-21.844121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T14-01-21.844121.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_21T14_01_21.844121
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T14-01-21.844121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T14-01-21.844121.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_21T14_01_21.844121
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T14-01-21.844121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T14-01-21.844121.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_21T14_01_21.844121
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T14-01-21.844121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T14-01-21.844121.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_21T14_01_21.844121
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T14-01-21.844121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T14-01-21.844121.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_21T14_01_21.844121
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T14-01-21.844121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T14-01-21.844121.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_21T14_01_21.844121
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T14-01-21.844121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T14-01-21.844121.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_21T14_01_21.844121
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T14-01-21.844121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T14-01-21.844121.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_21T14_01_21.844121
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T14-01-21.844121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T14-01-21.844121.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_21T14_01_21.844121
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T14-01-21.844121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T14-01-21.844121.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_21T14_01_21.844121
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T14-01-21.844121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T14-01-21.844121.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_21T14_01_21.844121
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T14-01-21.844121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T14-01-21.844121.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_21T14_01_21.844121
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T14-01-21.844121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T14-01-21.844121.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_21T14_01_21.844121
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T14-01-21.844121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T14-01-21.844121.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_21T14_01_21.844121
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T14-01-21.844121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T14-01-21.844121.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_21T14_01_21.844121
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T14-01-21.844121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T14-01-21.844121.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_21T14_01_21.844121
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-21T14-01-21.844121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-21T14-01-21.844121.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_21T14_01_21.844121
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T14-01-21.844121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T14-01-21.844121.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_21T14_01_21.844121
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T14-01-21.844121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T14-01-21.844121.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_21T14_01_21.844121
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T14-01-21.844121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T14-01-21.844121.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_21T14_01_21.844121
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T14-01-21.844121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T14-01-21.844121.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_21T14_01_21.844121
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T14-01-21.844121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T14-01-21.844121.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_21T14_01_21.844121
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T14-01-21.844121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T14-01-21.844121.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_21T14_01_21.844121
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T14-01-21.844121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T14-01-21.844121.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_21T14_01_21.844121
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T14-01-21.844121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T14-01-21.844121.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_21T14_01_21.844121
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T14-01-21.844121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T14-01-21.844121.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_21T14_01_21.844121
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T14-01-21.844121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T14-01-21.844121.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_21T14_01_21.844121
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T14-01-21.844121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T14-01-21.844121.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_21T14_01_21.844121
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T14-01-21.844121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T14-01-21.844121.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_21T14_01_21.844121
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T14-01-21.844121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T14-01-21.844121.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_21T14_01_21.844121
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T14-01-21.844121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T14-01-21.844121.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_21T14_01_21.844121
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T14-01-21.844121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T14-01-21.844121.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_21T14_01_21.844121
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T14-01-21.844121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T14-01-21.844121.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_21T14_01_21.844121
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T14-01-21.844121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T14-01-21.844121.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_21T14_01_21.844121
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T14-01-21.844121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T14-01-21.844121.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_21T14_01_21.844121
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-21T14-01-21.844121.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-21T14-01-21.844121.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_21T14_01_21.844121
path:
- '**/details_harness|winogrande|5_2024-03-21T14-01-21.844121.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-21T14-01-21.844121.parquet'
- config_name: results
data_files:
- split: 2024_03_21T14_01_21.844121
path:
- results_2024-03-21T14-01-21.844121.parquet
- split: latest
path:
- results_2024-03-21T14-01-21.844121.parquet
---
# Dataset Card for Evaluation run of Changgil/K2S3-SOLAR-11b-v4.0
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Changgil/K2S3-SOLAR-11b-v4.0](https://huggingface.co/Changgil/K2S3-SOLAR-11b-v4.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Changgil__K2S3-SOLAR-11b-v4.0",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-03-21T14:01:21.844121](https://huggingface.co/datasets/open-llm-leaderboard/details_Changgil__K2S3-SOLAR-11b-v4.0/blob/main/results_2024-03-21T14-01-21.844121.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6425151664260456,
"acc_stderr": 0.031453087920603535,
"acc_norm": 0.6544152950347083,
"acc_norm_stderr": 0.03231338017710922,
"mc1": 0.3623011015911873,
"mc1_stderr": 0.016826646897262258,
"mc2": 0.5163245208016025,
"mc2_stderr": 0.015033506821956658
},
"harness|arc:challenge|25": {
"acc": 0.590443686006826,
"acc_stderr": 0.014370358632472442,
"acc_norm": 0.636518771331058,
"acc_norm_stderr": 0.014056207319068285
},
"harness|hellaswag|10": {
"acc": 0.6509659430392352,
"acc_stderr": 0.004756905819649977,
"acc_norm": 0.847540330611432,
"acc_norm_stderr": 0.003587312328180707
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621503,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621503
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5703703703703704,
"acc_stderr": 0.042763494943765995,
"acc_norm": 0.5703703703703704,
"acc_norm_stderr": 0.042763494943765995
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.75,
"acc_stderr": 0.03523807393012047,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03523807393012047
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.028637235639800893,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.028637235639800893
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.035868792800803406,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.035868792800803406
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.035839017547364134,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.035839017547364134
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.04533838195929776,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.04533838195929776
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5829787234042553,
"acc_stderr": 0.03223276266711712,
"acc_norm": 0.5829787234042553,
"acc_norm_stderr": 0.03223276266711712
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.025424835086923992,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.025424835086923992
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.04403438954768177,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.04403438954768177
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8064516129032258,
"acc_stderr": 0.022475258525536057,
"acc_norm": 0.8064516129032258,
"acc_norm_stderr": 0.022475258525536057
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4630541871921182,
"acc_stderr": 0.035083705204426656,
"acc_norm": 0.4630541871921182,
"acc_norm_stderr": 0.035083705204426656
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.031922715695482995,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.031922715695482995
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8585858585858586,
"acc_stderr": 0.024825909793343343,
"acc_norm": 0.8585858585858586,
"acc_norm_stderr": 0.024825909793343343
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328973,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328973
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6410256410256411,
"acc_stderr": 0.024321738484602354,
"acc_norm": 0.6410256410256411,
"acc_norm_stderr": 0.024321738484602354
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.362962962962963,
"acc_stderr": 0.029318203645206865,
"acc_norm": 0.362962962962963,
"acc_norm_stderr": 0.029318203645206865
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.0302839955258844,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.0302839955258844
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8440366972477065,
"acc_stderr": 0.015555802713590177,
"acc_norm": 0.8440366972477065,
"acc_norm_stderr": 0.015555802713590177
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.033509916046960436,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.033509916046960436
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.025524722324553335,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.025524722324553335
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8396624472573839,
"acc_stderr": 0.02388438092596567,
"acc_norm": 0.8396624472573839,
"acc_norm_stderr": 0.02388438092596567
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7219730941704036,
"acc_stderr": 0.030069584874494036,
"acc_norm": 0.7219730941704036,
"acc_norm_stderr": 0.030069584874494036
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.035477710041594654,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.035477710041594654
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.035208939510976534,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.035208939510976534
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.042365112580946315,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.042365112580946315
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.04058042015646034,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.04058042015646034
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.020930193185179337,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.020930193185179337
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816508,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816508
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8199233716475096,
"acc_stderr": 0.013740797258579832,
"acc_norm": 0.8199233716475096,
"acc_norm_stderr": 0.013740797258579832
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.023618678310069356,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.023618678310069356
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2916201117318436,
"acc_stderr": 0.015201032512520432,
"acc_norm": 0.2916201117318436,
"acc_norm_stderr": 0.015201032512520432
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7679738562091504,
"acc_stderr": 0.024170840879340863,
"acc_norm": 0.7679738562091504,
"acc_norm_stderr": 0.024170840879340863
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.02600330111788514,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.02600330111788514
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7438271604938271,
"acc_stderr": 0.024288533637726095,
"acc_norm": 0.7438271604938271,
"acc_norm_stderr": 0.024288533637726095
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5070921985815603,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.5070921985815603,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.49478487614080835,
"acc_stderr": 0.012769541449652547,
"acc_norm": 0.49478487614080835,
"acc_norm_stderr": 0.012769541449652547
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.026799562024887674,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.026799562024887674
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6862745098039216,
"acc_stderr": 0.018771683893528176,
"acc_norm": 0.6862745098039216,
"acc_norm_stderr": 0.018771683893528176
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7551020408163265,
"acc_stderr": 0.027529637440174934,
"acc_norm": 0.7551020408163265,
"acc_norm_stderr": 0.027529637440174934
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8706467661691543,
"acc_stderr": 0.02372983088101853,
"acc_norm": 0.8706467661691543,
"acc_norm_stderr": 0.02372983088101853
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3623011015911873,
"mc1_stderr": 0.016826646897262258,
"mc2": 0.5163245208016025,
"mc2_stderr": 0.015033506821956658
},
"harness|winogrande|5": {
"acc": 0.8255722178374112,
"acc_stderr": 0.010665187902498442
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
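The JSON above is what each timestamped run stores: one entry per harness task, keyed as `harness|<task>|<n_shot>`. As an illustrative sketch (not part of the leaderboard tooling), one way to summarize the per-task MMLU scores from such a dict — the `results` literal below is a hand-copied subset of the full output above:

```python
# Summarize per-task MMLU (hendrycksTest) scores from a results dict
# shaped like the JSON above. `results` is a small hand-copied subset.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc_norm": 0.32},
    "harness|hendrycksTest-astronomy|5": {"acc_norm": 0.75},
    "harness|hendrycksTest-marketing|5": {"acc_norm": 0.8846153846153846},
    "harness|hendrycksTest-moral_scenarios|5": {"acc_norm": 0.2916201117318436},
}

# Keep only the hendrycksTest tasks and strip the harness key wrapping:
# "harness|hendrycksTest-astronomy|5" -> "astronomy"
mmlu = {
    key.split("|")[1].removeprefix("hendrycksTest-"): scores["acc_norm"]
    for key, scores in results.items()
    if "hendrycksTest-" in key
}

best = max(mmlu, key=mmlu.get)    # task with the highest acc_norm
worst = min(mmlu, key=mmlu.get)   # task with the lowest acc_norm
mean = sum(mmlu.values()) / len(mmlu)
print(best, worst, round(mean, 4))
```

On the subset above this prints `marketing moral_scenarios 0.5616`; run against the full dict, the mean reproduces the MMLU average the leaderboard reports.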
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard/details_ewqr2130__7B_ppo_phiRM_2GPU_3e-7step_4000 | ---
pretty_name: Evaluation run of ewqr2130/7B_ppo_phiRM_2GPU_3e-7step_4000
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ewqr2130/7B_ppo_phiRM_2GPU_3e-7step_4000](https://huggingface.co/ewqr2130/7B_ppo_phiRM_2GPU_3e-7step_4000)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ewqr2130__7B_ppo_phiRM_2GPU_3e-7step_4000\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-22T22:10:51.590429](https://huggingface.co/datasets/open-llm-leaderboard/details_ewqr2130__7B_ppo_phiRM_2GPU_3e-7step_4000/blob/main/results_2024-01-22T22-10-51.590429.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5972023812976862,\n\
\ \"acc_stderr\": 0.033037912443727974,\n \"acc_norm\": 0.6035719763948437,\n\
\ \"acc_norm_stderr\": 0.033725121148129075,\n \"mc1\": 0.2778457772337821,\n\
\ \"mc1_stderr\": 0.015680929364024643,\n \"mc2\": 0.41480772935959465,\n\
\ \"mc2_stderr\": 0.01453565986280891\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5366894197952219,\n \"acc_stderr\": 0.014572000527756989,\n\
\ \"acc_norm\": 0.5725255972696246,\n \"acc_norm_stderr\": 0.014456862944650647\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5994821748655647,\n\
\ \"acc_stderr\": 0.00489001935602109,\n \"acc_norm\": 0.8024297948615814,\n\
\ \"acc_norm_stderr\": 0.0039735233080143454\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542129,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542129\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.562962962962963,\n\
\ \"acc_stderr\": 0.04284958639753401,\n \"acc_norm\": 0.562962962962963,\n\
\ \"acc_norm_stderr\": 0.04284958639753401\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6447368421052632,\n \"acc_stderr\": 0.03894734487013317,\n\
\ \"acc_norm\": 0.6447368421052632,\n \"acc_norm_stderr\": 0.03894734487013317\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.51,\n\
\ \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6679245283018868,\n \"acc_stderr\": 0.02898545565233439,\n\
\ \"acc_norm\": 0.6679245283018868,\n \"acc_norm_stderr\": 0.02898545565233439\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6944444444444444,\n\
\ \"acc_stderr\": 0.03852084696008534,\n \"acc_norm\": 0.6944444444444444,\n\
\ \"acc_norm_stderr\": 0.03852084696008534\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6242774566473989,\n\
\ \"acc_stderr\": 0.036928207672648664,\n \"acc_norm\": 0.6242774566473989,\n\
\ \"acc_norm_stderr\": 0.036928207672648664\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.28431372549019607,\n \"acc_stderr\": 0.04488482852329017,\n\
\ \"acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.04488482852329017\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.8,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.8,\n\
\ \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5319148936170213,\n \"acc_stderr\": 0.03261936918467382,\n\
\ \"acc_norm\": 0.5319148936170213,\n \"acc_norm_stderr\": 0.03261936918467382\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.40350877192982454,\n\
\ \"acc_stderr\": 0.046151869625837026,\n \"acc_norm\": 0.40350877192982454,\n\
\ \"acc_norm_stderr\": 0.046151869625837026\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370333,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370333\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3968253968253968,\n \"acc_stderr\": 0.02519710107424649,\n \"\
acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.02519710107424649\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n\
\ \"acc_stderr\": 0.04325506042017086,\n \"acc_norm\": 0.373015873015873,\n\
\ \"acc_norm_stderr\": 0.04325506042017086\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7322580645161291,\n\
\ \"acc_stderr\": 0.025189006660212385,\n \"acc_norm\": 0.7322580645161291,\n\
\ \"acc_norm_stderr\": 0.025189006660212385\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\"\
: 0.62,\n \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7333333333333333,\n \"acc_stderr\": 0.03453131801885416,\n\
\ \"acc_norm\": 0.7333333333333333,\n \"acc_norm_stderr\": 0.03453131801885416\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7626262626262627,\n \"acc_stderr\": 0.030313710538198906,\n \"\
acc_norm\": 0.7626262626262627,\n \"acc_norm_stderr\": 0.030313710538198906\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8238341968911918,\n \"acc_stderr\": 0.027493504244548057,\n\
\ \"acc_norm\": 0.8238341968911918,\n \"acc_norm_stderr\": 0.027493504244548057\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5846153846153846,\n \"acc_stderr\": 0.02498535492310235,\n \
\ \"acc_norm\": 0.5846153846153846,\n \"acc_norm_stderr\": 0.02498535492310235\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.36666666666666664,\n \"acc_stderr\": 0.029381620726465073,\n \
\ \"acc_norm\": 0.36666666666666664,\n \"acc_norm_stderr\": 0.029381620726465073\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6260504201680672,\n \"acc_stderr\": 0.03142946637883708,\n \
\ \"acc_norm\": 0.6260504201680672,\n \"acc_norm_stderr\": 0.03142946637883708\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7926605504587156,\n \"acc_stderr\": 0.017381415563608674,\n \"\
acc_norm\": 0.7926605504587156,\n \"acc_norm_stderr\": 0.017381415563608674\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.42592592592592593,\n \"acc_stderr\": 0.033723432716530645,\n \"\
acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.033723432716530645\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7352941176470589,\n \"acc_stderr\": 0.030964517926923403,\n \"\
acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.030964517926923403\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7341772151898734,\n \"acc_stderr\": 0.02875679962965834,\n \
\ \"acc_norm\": 0.7341772151898734,\n \"acc_norm_stderr\": 0.02875679962965834\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6636771300448431,\n\
\ \"acc_stderr\": 0.031708824268455,\n \"acc_norm\": 0.6636771300448431,\n\
\ \"acc_norm_stderr\": 0.031708824268455\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7251908396946565,\n \"acc_stderr\": 0.03915345408847834,\n\
\ \"acc_norm\": 0.7251908396946565,\n \"acc_norm_stderr\": 0.03915345408847834\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6993865030674846,\n \"acc_stderr\": 0.03602511318806771,\n\
\ \"acc_norm\": 0.6993865030674846,\n \"acc_norm_stderr\": 0.03602511318806771\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n\
\ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n\
\ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8418803418803419,\n\
\ \"acc_stderr\": 0.023902325549560403,\n \"acc_norm\": 0.8418803418803419,\n\
\ \"acc_norm_stderr\": 0.023902325549560403\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7841634738186463,\n\
\ \"acc_stderr\": 0.014711684386139956,\n \"acc_norm\": 0.7841634738186463,\n\
\ \"acc_norm_stderr\": 0.014711684386139956\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6820809248554913,\n \"acc_stderr\": 0.025070713719153176,\n\
\ \"acc_norm\": 0.6820809248554913,\n \"acc_norm_stderr\": 0.025070713719153176\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3854748603351955,\n\
\ \"acc_stderr\": 0.016277927039638193,\n \"acc_norm\": 0.3854748603351955,\n\
\ \"acc_norm_stderr\": 0.016277927039638193\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.0267874531119065,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.0267874531119065\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6784565916398714,\n\
\ \"acc_stderr\": 0.026527724079528872,\n \"acc_norm\": 0.6784565916398714,\n\
\ \"acc_norm_stderr\": 0.026527724079528872\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.654320987654321,\n \"acc_stderr\": 0.026462487777001862,\n\
\ \"acc_norm\": 0.654320987654321,\n \"acc_norm_stderr\": 0.026462487777001862\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.42907801418439717,\n \"acc_stderr\": 0.02952591430255856,\n \
\ \"acc_norm\": 0.42907801418439717,\n \"acc_norm_stderr\": 0.02952591430255856\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.41916558018252936,\n\
\ \"acc_stderr\": 0.012602244505788236,\n \"acc_norm\": 0.41916558018252936,\n\
\ \"acc_norm_stderr\": 0.012602244505788236\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5808823529411765,\n \"acc_stderr\": 0.029972807170464622,\n\
\ \"acc_norm\": 0.5808823529411765,\n \"acc_norm_stderr\": 0.029972807170464622\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6160130718954249,\n \"acc_stderr\": 0.01967580813528152,\n \
\ \"acc_norm\": 0.6160130718954249,\n \"acc_norm_stderr\": 0.01967580813528152\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.0469237132203465,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.0469237132203465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6816326530612244,\n \"acc_stderr\": 0.029822533793982062,\n\
\ \"acc_norm\": 0.6816326530612244,\n \"acc_norm_stderr\": 0.029822533793982062\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8159203980099502,\n\
\ \"acc_stderr\": 0.027403859410786862,\n \"acc_norm\": 0.8159203980099502,\n\
\ \"acc_norm_stderr\": 0.027403859410786862\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036847,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036847\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n\
\ \"acc_stderr\": 0.03891364495835817,\n \"acc_norm\": 0.5120481927710844,\n\
\ \"acc_norm_stderr\": 0.03891364495835817\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.02991312723236804,\n\
\ \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.02991312723236804\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2778457772337821,\n\
\ \"mc1_stderr\": 0.015680929364024643,\n \"mc2\": 0.41480772935959465,\n\
\ \"mc2_stderr\": 0.01453565986280891\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7632202052091555,\n \"acc_stderr\": 0.01194759236520739\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2941622441243366,\n \
\ \"acc_stderr\": 0.012551285331470156\n }\n}\n```"
repo_url: https://huggingface.co/ewqr2130/7B_ppo_phiRM_2GPU_3e-7step_4000
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_22T22_10_51.590429
path:
- '**/details_harness|arc:challenge|25_2024-01-22T22-10-51.590429.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-22T22-10-51.590429.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_22T22_10_51.590429
path:
- '**/details_harness|gsm8k|5_2024-01-22T22-10-51.590429.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-22T22-10-51.590429.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_22T22_10_51.590429
path:
- '**/details_harness|hellaswag|10_2024-01-22T22-10-51.590429.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-22T22-10-51.590429.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_22T22_10_51.590429
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-22T22-10-51.590429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-22T22-10-51.590429.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-22T22-10-51.590429.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_22T22_10_51.590429
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T22-10-51.590429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T22-10-51.590429.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_22T22_10_51.590429
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-22T22-10-51.590429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-22T22-10-51.590429.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_22T22_10_51.590429
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-22T22-10-51.590429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-22T22-10-51.590429.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_22T22_10_51.590429
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T22-10-51.590429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T22-10-51.590429.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_22T22_10_51.590429
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T22-10-51.590429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T22-10-51.590429.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_22T22_10_51.590429
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-22T22-10-51.590429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-22T22-10-51.590429.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_22T22_10_51.590429
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T22-10-51.590429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T22-10-51.590429.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_22T22_10_51.590429
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T22-10-51.590429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T22-10-51.590429.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_22T22_10_51.590429
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T22-10-51.590429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T22-10-51.590429.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_22T22_10_51.590429
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T22-10-51.590429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T22-10-51.590429.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_22T22_10_51.590429
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-22T22-10-51.590429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-22T22-10-51.590429.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_22T22_10_51.590429
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-22T22-10-51.590429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-22T22-10-51.590429.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_22T22_10_51.590429
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T22-10-51.590429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T22-10-51.590429.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_22T22_10_51.590429
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-22T22-10-51.590429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-22T22-10-51.590429.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_22T22_10_51.590429
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T22-10-51.590429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T22-10-51.590429.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_22T22_10_51.590429
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T22-10-51.590429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T22-10-51.590429.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_22T22_10_51.590429
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T22-10-51.590429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T22-10-51.590429.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_22T22_10_51.590429
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-22T22-10-51.590429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-22T22-10-51.590429.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_22T22_10_51.590429
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T22-10-51.590429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T22-10-51.590429.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_22T22_10_51.590429
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T22-10-51.590429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T22-10-51.590429.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_22T22_10_51.590429
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T22-10-51.590429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T22-10-51.590429.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_22T22_10_51.590429
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T22-10-51.590429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T22-10-51.590429.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_22T22_10_51.590429
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T22-10-51.590429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T22-10-51.590429.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_22T22_10_51.590429
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T22-10-51.590429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T22-10-51.590429.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_22T22_10_51.590429
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T22-10-51.590429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T22-10-51.590429.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_22T22_10_51.590429
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T22-10-51.590429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T22-10-51.590429.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_22T22_10_51.590429
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T22-10-51.590429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T22-10-51.590429.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_22T22_10_51.590429
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T22-10-51.590429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T22-10-51.590429.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_22T22_10_51.590429
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T22-10-51.590429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T22-10-51.590429.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_22T22_10_51.590429
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T22-10-51.590429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T22-10-51.590429.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_22T22_10_51.590429
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T22-10-51.590429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T22-10-51.590429.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_22T22_10_51.590429
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T22-10-51.590429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T22-10-51.590429.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_22T22_10_51.590429
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-22T22-10-51.590429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-22T22-10-51.590429.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_22T22_10_51.590429
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T22-10-51.590429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T22-10-51.590429.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_22T22_10_51.590429
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-22T22-10-51.590429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-22T22-10-51.590429.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_22T22_10_51.590429
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T22-10-51.590429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T22-10-51.590429.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_22T22_10_51.590429
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T22-10-51.590429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T22-10-51.590429.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_22T22_10_51.590429
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T22-10-51.590429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T22-10-51.590429.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_22T22_10_51.590429
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-22T22-10-51.590429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-22T22-10-51.590429.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_22T22_10_51.590429
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-22T22-10-51.590429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-22T22-10-51.590429.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_22T22_10_51.590429
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T22-10-51.590429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T22-10-51.590429.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_22T22_10_51.590429
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T22-10-51.590429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T22-10-51.590429.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_22T22_10_51.590429
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T22-10-51.590429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T22-10-51.590429.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_22T22_10_51.590429
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T22-10-51.590429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T22-10-51.590429.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_22T22_10_51.590429
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-22T22-10-51.590429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-22T22-10-51.590429.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_22T22_10_51.590429
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-22T22-10-51.590429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-22T22-10-51.590429.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_22T22_10_51.590429
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-22T22-10-51.590429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-22T22-10-51.590429.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_22T22_10_51.590429
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T22-10-51.590429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T22-10-51.590429.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_22T22_10_51.590429
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-22T22-10-51.590429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-22T22-10-51.590429.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_22T22_10_51.590429
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T22-10-51.590429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T22-10-51.590429.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_22T22_10_51.590429
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T22-10-51.590429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T22-10-51.590429.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_22T22_10_51.590429
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-22T22-10-51.590429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-22T22-10-51.590429.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_22T22_10_51.590429
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-22T22-10-51.590429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-22T22-10-51.590429.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_22T22_10_51.590429
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-22T22-10-51.590429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-22T22-10-51.590429.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_22T22_10_51.590429
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T22-10-51.590429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T22-10-51.590429.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_22T22_10_51.590429
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-22T22-10-51.590429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-22T22-10-51.590429.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_22T22_10_51.590429
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-22T22-10-51.590429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-22T22-10-51.590429.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_22T22_10_51.590429
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-22T22-10-51.590429.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-22T22-10-51.590429.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_22T22_10_51.590429
path:
- '**/details_harness|winogrande|5_2024-01-22T22-10-51.590429.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-22T22-10-51.590429.parquet'
- config_name: results
data_files:
- split: 2024_01_22T22_10_51.590429
path:
- results_2024-01-22T22-10-51.590429.parquet
- split: latest
path:
- results_2024-01-22T22-10-51.590429.parquet
---
# Dataset Card for Evaluation run of ewqr2130/7B_ppo_phiRM_2GPU_3e-7step_4000
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ewqr2130/7B_ppo_phiRM_2GPU_3e-7step_4000](https://huggingface.co/ewqr2130/7B_ppo_phiRM_2GPU_3e-7step_4000) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ewqr2130__7B_ppo_phiRM_2GPU_3e-7step_4000",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-22T22:10:51.590429](https://huggingface.co/datasets/open-llm-leaderboard/details_ewqr2130__7B_ppo_phiRM_2GPU_3e-7step_4000/blob/main/results_2024-01-22T22-10-51.590429.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5972023812976862,
"acc_stderr": 0.033037912443727974,
"acc_norm": 0.6035719763948437,
"acc_norm_stderr": 0.033725121148129075,
"mc1": 0.2778457772337821,
"mc1_stderr": 0.015680929364024643,
"mc2": 0.41480772935959465,
"mc2_stderr": 0.01453565986280891
},
"harness|arc:challenge|25": {
"acc": 0.5366894197952219,
"acc_stderr": 0.014572000527756989,
"acc_norm": 0.5725255972696246,
"acc_norm_stderr": 0.014456862944650647
},
"harness|hellaswag|10": {
"acc": 0.5994821748655647,
"acc_stderr": 0.00489001935602109,
"acc_norm": 0.8024297948615814,
"acc_norm_stderr": 0.0039735233080143454
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542129,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542129
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.562962962962963,
"acc_stderr": 0.04284958639753401,
"acc_norm": 0.562962962962963,
"acc_norm_stderr": 0.04284958639753401
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6447368421052632,
"acc_stderr": 0.03894734487013317,
"acc_norm": 0.6447368421052632,
"acc_norm_stderr": 0.03894734487013317
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6679245283018868,
"acc_stderr": 0.02898545565233439,
"acc_norm": 0.6679245283018868,
"acc_norm_stderr": 0.02898545565233439
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.03852084696008534,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.03852084696008534
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6242774566473989,
"acc_stderr": 0.036928207672648664,
"acc_norm": 0.6242774566473989,
"acc_norm_stderr": 0.036928207672648664
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.04488482852329017,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.04488482852329017
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5319148936170213,
"acc_stderr": 0.03261936918467382,
"acc_norm": 0.5319148936170213,
"acc_norm_stderr": 0.03261936918467382
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.40350877192982454,
"acc_stderr": 0.046151869625837026,
"acc_norm": 0.40350877192982454,
"acc_norm_stderr": 0.046151869625837026
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370333,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370333
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.02519710107424649,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.02519710107424649
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.04325506042017086,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.04325506042017086
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7322580645161291,
"acc_stderr": 0.025189006660212385,
"acc_norm": 0.7322580645161291,
"acc_norm_stderr": 0.025189006660212385
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7333333333333333,
"acc_stderr": 0.03453131801885416,
"acc_norm": 0.7333333333333333,
"acc_norm_stderr": 0.03453131801885416
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7626262626262627,
"acc_stderr": 0.030313710538198906,
"acc_norm": 0.7626262626262627,
"acc_norm_stderr": 0.030313710538198906
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8238341968911918,
"acc_stderr": 0.027493504244548057,
"acc_norm": 0.8238341968911918,
"acc_norm_stderr": 0.027493504244548057
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5846153846153846,
"acc_stderr": 0.02498535492310235,
"acc_norm": 0.5846153846153846,
"acc_norm_stderr": 0.02498535492310235
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.36666666666666664,
"acc_stderr": 0.029381620726465073,
"acc_norm": 0.36666666666666664,
"acc_norm_stderr": 0.029381620726465073
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6260504201680672,
"acc_stderr": 0.03142946637883708,
"acc_norm": 0.6260504201680672,
"acc_norm_stderr": 0.03142946637883708
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7926605504587156,
"acc_stderr": 0.017381415563608674,
"acc_norm": 0.7926605504587156,
"acc_norm_stderr": 0.017381415563608674
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.033723432716530645,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.033723432716530645
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.030964517926923403,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.030964517926923403
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7341772151898734,
"acc_stderr": 0.02875679962965834,
"acc_norm": 0.7341772151898734,
"acc_norm_stderr": 0.02875679962965834
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6636771300448431,
"acc_stderr": 0.031708824268455,
"acc_norm": 0.6636771300448431,
"acc_norm_stderr": 0.031708824268455
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7251908396946565,
"acc_stderr": 0.03915345408847834,
"acc_norm": 0.7251908396946565,
"acc_norm_stderr": 0.03915345408847834
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6993865030674846,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.6993865030674846,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8418803418803419,
"acc_stderr": 0.023902325549560403,
"acc_norm": 0.8418803418803419,
"acc_norm_stderr": 0.023902325549560403
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7841634738186463,
"acc_stderr": 0.014711684386139956,
"acc_norm": 0.7841634738186463,
"acc_norm_stderr": 0.014711684386139956
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6820809248554913,
"acc_stderr": 0.025070713719153176,
"acc_norm": 0.6820809248554913,
"acc_norm_stderr": 0.025070713719153176
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3854748603351955,
"acc_stderr": 0.016277927039638193,
"acc_norm": 0.3854748603351955,
"acc_norm_stderr": 0.016277927039638193
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.0267874531119065,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.0267874531119065
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6784565916398714,
"acc_stderr": 0.026527724079528872,
"acc_norm": 0.6784565916398714,
"acc_norm_stderr": 0.026527724079528872
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.654320987654321,
"acc_stderr": 0.026462487777001862,
"acc_norm": 0.654320987654321,
"acc_norm_stderr": 0.026462487777001862
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.42907801418439717,
"acc_stderr": 0.02952591430255856,
"acc_norm": 0.42907801418439717,
"acc_norm_stderr": 0.02952591430255856
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.41916558018252936,
"acc_stderr": 0.012602244505788236,
"acc_norm": 0.41916558018252936,
"acc_norm_stderr": 0.012602244505788236
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5808823529411765,
"acc_stderr": 0.029972807170464622,
"acc_norm": 0.5808823529411765,
"acc_norm_stderr": 0.029972807170464622
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6160130718954249,
"acc_stderr": 0.01967580813528152,
"acc_norm": 0.6160130718954249,
"acc_norm_stderr": 0.01967580813528152
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6,
"acc_stderr": 0.0469237132203465,
"acc_norm": 0.6,
"acc_norm_stderr": 0.0469237132203465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6816326530612244,
"acc_stderr": 0.029822533793982062,
"acc_norm": 0.6816326530612244,
"acc_norm_stderr": 0.029822533793982062
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8159203980099502,
"acc_stderr": 0.027403859410786862,
"acc_norm": 0.8159203980099502,
"acc_norm_stderr": 0.027403859410786862
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036847,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036847
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5120481927710844,
"acc_stderr": 0.03891364495835817,
"acc_norm": 0.5120481927710844,
"acc_norm_stderr": 0.03891364495835817
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.02991312723236804,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.02991312723236804
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2778457772337821,
"mc1_stderr": 0.015680929364024643,
"mc2": 0.41480772935959465,
"mc2_stderr": 0.01453565986280891
},
"harness|winogrande|5": {
"acc": 0.7632202052091555,
"acc_stderr": 0.01194759236520739
},
"harness|gsm8k|5": {
"acc": 0.2941622441243366,
"acc_stderr": 0.012551285331470156
}
}
```
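As a sanity check on results like these, one can average a handful of the per-task scores by hand. This is only an illustrative sketch using four of the scores reported above: the leaderboard's actual aggregation may select and weight tasks differently than the naive mean below.

```python
# Illustrative only: recompute a simple mean over a few of the per-task
# scores from the results JSON above. The real leaderboard aggregation
# may differ (e.g., which metric and which tasks it averages).
scores = {
    "harness|arc:challenge|25": 0.5725255972696246,  # acc_norm
    "harness|hellaswag|10": 0.8024297948615814,      # acc_norm
    "harness|winogrande|5": 0.7632202052091555,      # acc
    "harness|gsm8k|5": 0.2941622441243366,           # acc
}

average = sum(scores.values()) / len(scores)
print(f"mean over {len(scores)} tasks: {average:.4f}")  # → mean over 4 tasks: 0.6081
```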
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
H34lthy/NGCSE_hybrid_dataset | ---
license: mit
---
|
tiiuae/falcon-refinedweb | ---
dataset_info:
features:
- name: content
dtype: string
- name: url
dtype: string
- name: timestamp
dtype: timestamp[s]
- name: dump
dtype: string
- name: segment
dtype: string
- name: image_urls
sequence:
sequence: string
splits:
- name: train
num_bytes: 2766953721769
num_examples: 968000015
download_size: 466888198663
dataset_size: 2766953721769
license: odc-by
task_categories:
- text-generation
language:
- en
pretty_name: Falcon RefinedWeb
size_categories:
- 100B<n<1T
---
# 📀 Falcon RefinedWeb
**Falcon RefinedWeb is a massive English web dataset built by [TII](https://www.tii.ae) and released under an ODC-By 1.0 license.**
See the 📓 [paper on arXiv](https://arxiv.org/abs/2306.01116) for more details.
RefinedWeb is built through stringent filtering and large-scale deduplication of CommonCrawl; we found models trained on RefinedWeb to achieve performance in line with or better than models trained on curated datasets, while only relying on web data.
RefinedWeb is also "multimodal-friendly": it contains links and alt texts for images in processed samples.
This public extract should contain 500-650GT depending on the tokenizer you use, and can be enhanced with the curated corpora of your choosing. It is about 500GB to download, and requires 2.8TB of local storage once unpacked.
```python
from datasets import load_dataset
rw = load_dataset("tiiuae/falcon-refinedweb")
```
RefinedWeb is the main dataset we have used for training the [Falcon LLM](https://falconllm.tii.ae) models:
* It was used in conjunction with a curated corpora to train Falcon-[7B](https://huggingface.co/tiiuae/falcon-7b)/[40B](https://huggingface.co/tiiuae/falcon-40b), two state-of-the-art open-source models.
* It was also used to train Falcon-RW-[1B](https://huggingface.co/tiiuae/falcon-rw-1b)/[7B](https://huggingface.co/tiiuae/falcon-rw-7b), two models trained on 350 billion tokens of RefinedWeb alone to demonstrate its quality compared to curated corpora.
# Dataset card for Falcon RefinedWeb
## Dataset Description
* **Homepage:** [falconllm.tii.ae](https://falconllm.tii.ae)
* **Paper:** [https://arxiv.org/abs/2306.01116](https://arxiv.org/abs/2306.01116)
* **Point of Contact:** [falconllm@tii.ae](mailto:falconllm@tii.ae)
### Dataset Summary
Falcon RefinedWeb was created to serve as an English large-scale dataset for the pretraining of large language models. It may be used on its own, or augmented with curated sources (e.g., Wikipedia, StackOverflow).
It was built on top of CommonCrawl, leveraging stringent filtering and extensive deduplication.
### Supported Tasks and Leaderboards
RefinedWeb is intended to be primarily used as a pretraining dataset for large language models. Practitioners may leverage it for upstream evaluation with a validation loss, but we do not provide any canonical split.
### Languages
RefinedWeb primarily contains English.
## Dataset Structure
### Data Instances
Each data instance corresponds to an individual web page which has been crawled, processed, and deduplicated against all other instances.
This public extract of RefinedWeb contains about 1B instances (968M individual web pages), for a total of 2.8TB of clean text data.
### Data Fields
* `content`: the processed and cleaned text contained in the page;
* `url`: the url of the webpage crawled to produce the sample;
* `timestamp`: timestamp of when the webpage was crawled by CommonCrawl;
* `dump`: the CommonCrawl dump the sample is a part of;
* `segment`: the CommonCrawl segment the sample is a part of;
* `image_urls`: a list of elements in the type [`image_url`, `image_alt_text`] for all the images found in the content of the sample.
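For illustration only (the field names come from the list above; every value below is invented), a record can be handled as a plain dictionary, with `image_urls` holding `[image_url, image_alt_text]` pairs:

```python
# A mock RefinedWeb record illustrating the schema above.
# Field names are from the card; the values are invented.
sample = {
    "content": "An example of clean text extracted from a web page.",
    "url": "https://example.com/page",
    "timestamp": "2023-01-15T12:00:00Z",
    "dump": "CC-MAIN-2023-06",
    "segment": "1674764494852.95",
    "image_urls": [
        ["https://example.com/cat.jpg", "a photo of a cat"],
        ["https://example.com/dog.jpg", None],  # alt text may be missing
    ],
}

# Unpack the [image_url, image_alt_text] pairs.
for image_url, alt_text in sample["image_urls"]:
    print(image_url, alt_text or "<no alt text>")
```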
### Data Splits
We do not provide any canonical splits for RefinedWeb.
## Dataset Creation
### Curation Rationale
Falcon RefinedWeb is built on top of [CommonCrawl](https://commoncrawl.org), using the Macrodata Refinement Pipeline, which combines content extraction, filtering heuristics, and deduplication.
In designing RefinedWeb, we abided by the following philosophy:
* (1) **Scale first.** We intend MDR to produce datasets to be used to train 40-200B parameter models, thus requiring trillions of tokens [(Hoffmann et al., 2022)](https://arxiv.org/abs/2203.15556). For English-only RefinedWeb, we target a size of 3-6 trillion tokens. Specifically, we eschew any labour-intensive human curation process, and focus on CommonCrawl instead of disparate single-domain sources.
* (2) **Strict deduplication.** Inspired by the work of [Lee et al., 2021](https://arxiv.org/abs/2107.06499), which demonstrated the value of deduplication for large language models, we implement a rigorous deduplication pipeline. We combine both exact and fuzzy deduplication, and use strict settings leading to removal rates far higher than other datasets have reported.
* (3) **Neutral filtering.** To avoid introducing further undesirable biases into the model, we avoid using ML-based filtering outside of language identification ([Dodge et al., 2021](https://arxiv.org/abs/2104.08758); [Welbl et al., 2021](https://arxiv.org/abs/2109.07445)). We stick to simple rules and heuristics, and use only URL filtering for adult content.
During its development, we iterated on RefinedWeb by measuring the zero-shot performance of models trained on development versions of the dataset. Our main goal was to maximize the performance obtained, bridging the gap between curated and web data. We also manually audited samples to identify potential filtering improvements.
### Source Data
RefinedWeb is built from [CommonCrawl](https://commoncrawl.org) dumps. These dumps are constructed from crawling publicly available web pages.
### Data Collection and Preprocessing
We applied extensive preprocessing and cleaning of the data, using our Macrodata Refinement Pipeline.
We first filter URLs to remove adult content using a blocklist and a score system. We then use `trafilatura` to extract content from pages, and perform language identification with the `fastText` classifier from CCNet ([Wenzek et al., 2019](https://arxiv.org/abs/1911.00359)). After this first preprocessing stage, we filter data using heuristics from MassiveWeb ([Rae et al., 2021](https://arxiv.org/abs/2112.11446)), and our own line-wise corrections.
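As an illustration of the blocklist-plus-score idea for URL filtering (the domains, words, weights, and threshold below are invented for the sketch; the pipeline's actual lists are not published in this card):

```python
from urllib.parse import urlparse

# Illustrative assumptions only: a tiny domain blocklist and a few
# "soft" words that each add to a URL's score. The real pipeline's
# lists and threshold are not specified here.
BLOCKED_DOMAINS = {"badsite.example"}
SOFT_WORDS = {"casino": 2, "webcam": 1}
SCORE_THRESHOLD = 2

def keep_url(url: str) -> bool:
    """Return True if the URL passes both the blocklist and the score filter."""
    parsed = urlparse(url)
    if parsed.netloc in BLOCKED_DOMAINS:
        return False
    score = sum(w for word, w in SOFT_WORDS.items() if word in url.lower())
    return score < SCORE_THRESHOLD

print(keep_url("https://news.example/article"))   # True
print(keep_url("https://badsite.example/page"))   # False
```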
Finally, we run extensive deduplication, removing URLs revisited across dumps, then performing fuzzy deduplication followed by exact substring deduplication.
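A minimal sketch of the exact side of this deduplication (URL revisits plus exact duplicates of whitespace-normalized content; the fuzzy MinHash stage is omitted, and the sample documents are invented):

```python
import hashlib

def exact_dedup(records):
    """Keep the first occurrence of each URL and of each normalized-content hash."""
    seen_urls, seen_hashes = set(), set()
    kept = []
    for rec in records:
        # Normalize whitespace before hashing so trivially reformatted copies collide.
        digest = hashlib.sha256(" ".join(rec["content"].split()).encode()).hexdigest()
        if rec["url"] in seen_urls or digest in seen_hashes:
            continue
        seen_urls.add(rec["url"])
        seen_hashes.add(digest)
        kept.append(rec)
    return kept

docs = [
    {"url": "https://a.example", "content": "hello world"},
    {"url": "https://a.example", "content": "revisited page"},   # URL revisit
    {"url": "https://b.example", "content": "hello   world"},    # exact dup after normalization
    {"url": "https://c.example", "content": "something new"},
]
print([d["url"] for d in exact_dedup(docs)])  # ['https://a.example', 'https://c.example']
```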
### Annotations
We provide automatically collected annotations for the source `url`, `timestamp` of the crawl, original CommonCrawl `dump` and `segment` in which the document was found, and `image_urls` contained in the page.
### Personal and Sensitive Information
As RefinedWeb is built upon publicly available web pages, it may contain sensitive information such as emails, phone numbers, or IP addresses. We believe that deduplication may have helped reduce the prevalence of PII in the dataset, but practitioners working with RefinedWeb should take care.
## Considerations for Using the Data
### Social Impact of Dataset
With the open-source release of Falcon RefinedWeb, we aim to increase access to high-quality web data, which has typically been held private by model developers. We believe this release will in turn improve the accessibility and the spread of performant large language models.
### Discussion of Biases
As toxic or biased data is prevalent on the internet, it is likely our dataset contains such content. Notably, using the Perspective API, we estimated the prevalence of toxic content in the dataset to be similar to that of The Pile.
### Other Known Limitations
Despite our best efforts to filter content that does not qualify as natural language, and to deduplicate documents, our pipeline may let through documents that may be considered errors or redundant.
## Additional Information
### Licensing Information
This public extract is made available under an [ODC-By 1.0](https://opendatacommons.org/licenses/by/1-0/) license; users should also abide to the [CommonCrawl ToU](https://commoncrawl.org/terms-of-use/).
### Citation Information
```
@article{refinedweb,
title={The {R}efined{W}eb dataset for {F}alcon {LLM}: outperforming curated corpora with web data, and web data only},
author={Guilherme Penedo and Quentin Malartic and Daniel Hesslow and Ruxandra Cojocaru and Alessandro Cappelli and Hamza Alobeidli and Baptiste Pannier and Ebtesam Almazrouei and Julien Launay},
journal={arXiv preprint arXiv:2306.01116},
eprint={2306.01116},
eprinttype = {arXiv},
url={https://arxiv.org/abs/2306.01116},
year={2023}
}
```
### Opt-out request
RefinedWeb is based on [CommonCrawl](https://commoncrawl.org/). Their crawler honors opt-out requests in the `robots.txt`, see the [CC FAQ](https://commoncrawl.org/big-picture/frequently-asked-questions/) for details.
To remove a document from RefinedWeb, please message falconllm@tii.ae.
### Contact
falconllm@tii.ae |
KayoSilva88777/CarlosFernandes | ---
license: openrail
---
|
Jing24/seperate_all3 | ---
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
struct:
- name: answer_start
sequence: int32
- name: text
sequence: string
splits:
- name: train
num_bytes: 29434260
num_examples: 32797
download_size: 5385948
dataset_size: 29434260
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "seperate_all3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Rejulraghu/HolyC | ---
license: mit
---
|
CyberHarem/koyasu_tsubame | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Koyasu Tsubame
This is the dataset of Koyasu Tsubame, containing 20 images and their tags.
The core tags of this character are `pink_hair, pink_eyes, breasts, hair_bun, single_hair_bun, red_eyes, earrings, hairband, ribbon, short_hair, bangs, hair_ribbon`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 20 | 30.79 MiB | [Download](https://huggingface.co/datasets/CyberHarem/koyasu_tsubame/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 20 | 14.78 MiB | [Download](https://huggingface.co/datasets/CyberHarem/koyasu_tsubame/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 50 | 33.90 MiB | [Download](https://huggingface.co/datasets/CyberHarem/koyasu_tsubame/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 20 | 25.60 MiB | [Download](https://huggingface.co/datasets/CyberHarem/koyasu_tsubame/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 50 | 53.18 MiB | [Download](https://huggingface.co/datasets/CyberHarem/koyasu_tsubame/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/koyasu_tsubame',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 20 |  |  |  |  |  | 1girl, looking_at_viewer, solo, blush, smile, jewelry, open_mouth, shuuchiin_academy_school_uniform, collarbone, simple_background, white_background, black_dress, long_sleeves |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | blush | smile | jewelry | open_mouth | shuuchiin_academy_school_uniform | collarbone | simple_background | white_background | black_dress | long_sleeves |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:--------|:--------|:----------|:-------------|:-----------------------------------|:-------------|:--------------------|:-------------------|:--------------|:---------------|
| 0 | 20 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
Hadnet/olavo-article-17k-llama2-chat-dataset-text | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
- name: input
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 9693532
num_examples: 17361
download_size: 5505395
dataset_size: 9693532
---
# Dataset Card for "olavo-article-17k-llama2-chat-dataset-text"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
davanstrien/fake-library-chats-with-sentiment | ---
dataset_info:
- config_name: default
features:
- name: message
dtype: string
- name: message sentiment
dtype:
class_label:
names:
'0': positive
'1': negative
'2': neutral
splits:
- name: train
num_bytes: 674584
num_examples: 10000
download_size: 0
dataset_size: 674584
- config_name: demo
features:
- name: message
dtype: string
- name: message sentiment
dtype:
class_label:
names:
'0': positive
'1': negative
'2': neutral
splits:
- name: train
num_bytes: 674584
num_examples: 10000
download_size: 28880
dataset_size: 674584
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- config_name: demo
data_files:
- split: train
path: demo/train-*
---
# Dataset Card for "fake-library-chats-with-sentiment"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
cfilt/iitb-english-hindi | ---
language:
- en
- hi
---
<p align="center"><img src="https://huggingface.co/datasets/cfilt/HiNER-collapsed/raw/main/cfilt-dark-vec.png" alt="Computation for Indian Language Technology Logo" width="150" height="150"/></p>
# IITB-English-Hindi Parallel Corpus
[](https://creativecommons.org/licenses/by-nc/4.0/)
[](https://twitter.com/cfiltnlp)
[](https://twitter.com/PeopleCentredAI)
## About
The IIT Bombay English-Hindi corpus contains a parallel corpus for English-Hindi as well as a monolingual Hindi corpus collected from a variety of existing sources and corpora developed at the Center for Indian Language Technology, IIT Bombay over the years. This page describes the corpus. This corpus has been used at the Workshop on Asian Language Translation Shared Task since 2016 for the Hindi-to-English and English-to-Hindi language pairs, and as a pivot language pair for the Hindi-to-Japanese and Japanese-to-Hindi language pairs.
The complete details of this corpus are available at [this URL](https://www.cfilt.iitb.ac.in/iitb_parallel/). We also provide this parallel corpus via browser download from the same URL. We also provide a monolingual Hindi corpus on the same URL.
### Recent Updates
* Version 3.1 - December 2021 - Added 49,400 sentence pairs to the parallel corpus.
* Version 3.0 - August 2020 - Added ~47,000 sentence pairs to the parallel corpus.
## Usage
We provide a notebook that shows how to import the IITB English-Hindi Parallel Corpus from the HuggingFace datasets repository. The notebook also shows how to segment the corpus using BPE tokenization which can be used to train an English-Hindi MT System.
[https://github.com/cfiltnlp/IITB-English-Hindi-PC](https://github.com/cfiltnlp/IITB-English-Hindi-PC)
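Before BPE segmentation, parallel corpora are commonly cleaned with simple heuristics such as a length-ratio filter. The sketch below is illustrative only (the sentence pairs and the threshold are invented, not taken from the corpus or the notebook):

```python
def length_ratio_ok(src: str, tgt: str, max_ratio: float = 3.0) -> bool:
    """Keep pairs whose whitespace-token counts are within max_ratio of each other."""
    n_src, n_tgt = len(src.split()), len(tgt.split())
    if n_src == 0 or n_tgt == 0:
        return False
    return max(n_src, n_tgt) / min(n_src, n_tgt) <= max_ratio

# Invented English-Hindi pairs for illustration.
pairs = [
    ("The weather is nice today.", "आज मौसम अच्छा है।"),
    ("Hello", "यह एक बहुत लंबा और असंबंधित वाक्य है जो मेल नहीं खाता।"),
]
filtered = [p for p in pairs if length_ratio_ok(*p)]
print(len(filtered))  # 1
```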
## Other
You can find a catalogue of other English-Hindi and other Indian language parallel corpora here: [Indic NLP Catalog](https://github.com/indicnlpweb/indicnlp_catalog)
## Maintainer(s)
[Diptesh Kanojia](https://dipteshkanojia.github.io)<br/>
Shivam Mhasker<br/>
## Citation
If you use this corpus or its derivate resources for your research, kindly cite it as follows:
Anoop Kunchukuttan, Pratik Mehta, Pushpak Bhattacharyya. The IIT Bombay English-Hindi Parallel Corpus. Language Resources and Evaluation Conference. 2018.
### BiBTeX Citation
```latex
@inproceedings{kunchukuttan-etal-2018-iit,
title = "The {IIT} {B}ombay {E}nglish-{H}indi Parallel Corpus",
author = "Kunchukuttan, Anoop and
Mehta, Pratik and
Bhattacharyya, Pushpak",
booktitle = "Proceedings of the Eleventh International Conference on Language Resources and Evaluation ({LREC} 2018)",
month = may,
year = "2018",
address = "Miyazaki, Japan",
publisher = "European Language Resources Association (ELRA)",
url = "https://aclanthology.org/L18-1548",
}
``` |
ludis/geepeetee4 | ---
pretty_name: geepeetee4
tags:
- not-for-all-audiences
- conversational
- roleplay
size_categories:
- 1K<n<10K
---
The CSVs are from ludis/whocars.
I only ran that script on them; I didn't do anything else to the data, so take it as you will. |
CyberHarem/biscuit_nikke | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of biscuit/ビスケット/饼干/비스킷 (Nikke: Goddess of Victory)
This is the dataset of biscuit/ビスケット/饼干/비스킷 (Nikke: Goddess of Victory), containing 19 images and their tags.
The core tags of this character are `hair_ornament, yellow_eyes, animal_ears, tail, ahoge, bow, breasts, brown_hair, medium_hair, bangs, dog_ears, hair_bow, large_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:---------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 19 | 38.99 MiB | [Download](https://huggingface.co/datasets/CyberHarem/biscuit_nikke/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 19 | 18.65 MiB | [Download](https://huggingface.co/datasets/CyberHarem/biscuit_nikke/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 48 | 43.54 MiB | [Download](https://huggingface.co/datasets/CyberHarem/biscuit_nikke/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 19 | 32.51 MiB | [Download](https://huggingface.co/datasets/CyberHarem/biscuit_nikke/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 48 | 66.40 MiB | [Download](https://huggingface.co/datasets/CyberHarem/biscuit_nikke/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/biscuit_nikke',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------|
| 0 | 19 |  |  |  |  |  | 1girl, solo, blush, open_mouth, long_sleeves, looking_at_viewer, smile, dog, dress, virtual_youtuber, shirt |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | blush | open_mouth | long_sleeves | looking_at_viewer | smile | dog | dress | virtual_youtuber | shirt |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------|:-------------|:---------------|:--------------------|:--------|:------|:--------|:-------------------|:--------|
| 0 | 19 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X |
|
AdapterOcean/math_dataset_standardized_cluster_0 | ---
dataset_info:
features:
- name: text
dtype: string
- name: conversation_id
dtype: int64
- name: embedding
sequence: float64
- name: cluster
dtype: int64
splits:
- name: train
num_bytes: 41690673
num_examples: 4461
download_size: 11292125
dataset_size: 41690673
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "math_dataset_standardized_cluster_0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
laion/laion2B-multi-watermark | Invalid username or password. |
CyberHarem/koshigaya_komari_nonnonbiyori | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Koshigaya Komari
This is the dataset of Koshigaya Komari, containing 300 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:----------------|---------:|:----------------------------------------|:-----------------------------------------------------------------------------------------|
| raw | 300 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 746 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| raw-stage3-eyes | 833 | [Download](dataset-raw-stage3-eyes.zip) | 3-stage cropped (with eye-focus) raw data with meta information. |
| 384x512 | 300 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x704 | 300 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x880 | 300 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 746 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 746 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-p512-640 | 584 | [Download](dataset-stage3-p512-640.zip) | 3-stage cropped dataset with the area not less than 512x512 pixels. |
| stage3-eyes-640 | 833 | [Download](dataset-stage3-eyes-640.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 640 pixels. |
| stage3-eyes-800 | 833 | [Download](dataset-stage3-eyes-800.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 800 pixels. |
|
fathyshalab/massive_transport | ---
dataset_info:
features:
- name: id
dtype: string
- name: label
dtype: int64
- name: text
dtype: string
splits:
- name: train
num_bytes: 34823
num_examples: 571
- name: validation
num_bytes: 6699
num_examples: 110
- name: test
num_bytes: 7228
num_examples: 124
download_size: 0
dataset_size: 48750
---
# Dataset Card for "massive_transport"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
bigscience-data/roots_indic-ta_wikiquote | ---
language: ta
license: cc-by-sa-3.0
extra_gated_prompt: 'By accessing this dataset, you agree to abide by the BigScience
Ethical Charter. The charter can be found at:
https://hf.co/spaces/bigscience/ethical-charter'
extra_gated_fields:
I have read and agree to abide by the BigScience Ethical Charter: checkbox
---
ROOTS Subset: roots_indic-ta_wikiquote
# wikiquote_filtered
- Dataset uid: `wikiquote_filtered`
### Description
### Homepage
### Licensing
### Speaker Locations
### Sizes
- 0.0462 % of total
- 0.1697 % of en
- 0.0326 % of fr
- 0.0216 % of ar
- 0.0066 % of zh
- 0.0833 % of pt
- 0.0357 % of es
- 0.0783 % of indic-ta
- 0.0361 % of indic-hi
- 0.0518 % of ca
- 0.0405 % of vi
- 0.0834 % of indic-ml
- 0.0542 % of indic-te
- 0.1172 % of indic-gu
- 0.0634 % of indic-kn
- 0.0539 % of id
- 0.0454 % of indic-ur
- 0.0337 % of indic-mr
- 0.0347 % of eu
### BigScience processing steps
#### Filters applied to: en
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_en
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_1024
#### Filters applied to: fr
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_fr
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_1024
#### Filters applied to: ar
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_ar
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: zh
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_zhs
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_1024
#### Filters applied to: pt
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_pt
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: es
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_es
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_1024
#### Filters applied to: indic-ta
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_indic-ta
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: indic-hi
- dedup_document
- filter_remove_empty_docs
- split_sentences_indic-hi
- dedup_template_soft
- filter_small_docs_bytes_300
#### Filters applied to: ca
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_ca
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_1024
#### Filters applied to: vi
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_vi
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: indic-ml
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_indic-ml
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: indic-te
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_indic-te
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: indic-gu
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_indic-gu
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: indic-kn
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_indic-kn
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: id
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_id
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: indic-ur
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-mr
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_indic-mr
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: eu
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_eu
- dedup_template_soft
- replace_newline_with_space
|
liyucheng/allsides_metaphor | ---
dataset_info:
features:
- name: urls
dtype: string
- name: sents
sequence: string
- name: vua_metaphors
sequence: int64
- name: novel_metaphors
sequence: int64
splits:
- name: train
num_bytes: 23322603
num_examples: 28883
download_size: 2935494
dataset_size: 23322603
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "allsides_metaphor"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
alancooney/relation_counterfact | ---
license: mit
---
# Dataset Card for Relation Counterfact
## Dataset Description
- **Repository:** https://github.com/bilal-chughtai/attention-head-superposition
- **Paper:** TBC
- **Point of Contact:** Alan Cooney
### Dataset Summary
This dataset is based on the [Rome Counterfact dataset](https://rome.baulab.info/).
It is adjusted so that all re-written prompts are in the same order and form of
[Subject] [Relationship] -> [Attribute] (e.g. Beats Music is owned by -> Apple).
The dataset also uses additional validation rules to remove examples where the
correct attribute is ambiguous, as well as other unsuitable dataset examples
(e.g. where the attribute is included in the subject such as "Porsche 911 is
created by -> Porsche"). Finally, a list of `subject_relevant_words` is included
to help study the residual stream at different points.
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
English
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
sordonia/adauni-v1-flat | ---
dataset_info:
features:
- name: source
dtype: string
- name: target
dtype: string
- name: task_name
dtype: string
- name: task_source
dtype: string
- name: split
dtype: string
splits:
- name: train
num_bytes: 7385230805
num_examples: 3928352
download_size: 0
dataset_size: 7385230805
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Used datasets:
## sordonia/flan-10k-flat
## sordonia/mmlu-qa-flat
## sordonia/platypus-flat
## sordonia/ultrachat-32c-10k-flat
## Total number of tasks: 439
|
ambrosemcduffy/newAnim_dataset | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 24444584.0
num_examples: 53
download_size: 24332541
dataset_size: 24444584.0
---
# Dataset Card for "newAnim_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
chronbmm/sanskrit-monolingual-pretraining | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1577362939
num_examples: 21371582
- name: validation
num_bytes: 8601369
num_examples: 26247
- name: test
num_bytes: 8601369
num_examples: 26247
download_size: 850823877
dataset_size: 1594565677
---
# Dataset Card for "sanskrit-monolingual-pretraining"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_openthaigpt__openthaigpt-1.0.0-7b-chat | ---
pretty_name: Evaluation run of openthaigpt/openthaigpt-1.0.0-7b-chat
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [openthaigpt/openthaigpt-1.0.0-7b-chat](https://huggingface.co/openthaigpt/openthaigpt-1.0.0-7b-chat)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_openthaigpt__openthaigpt-1.0.0-7b-chat\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-08T09:02:56.717414](https://huggingface.co/datasets/open-llm-leaderboard/details_openthaigpt__openthaigpt-1.0.0-7b-chat/blob/main/results_2024-04-08T09-02-56.717414.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.35718570026420443,\n\
\ \"acc_stderr\": 0.03360931297367493,\n \"acc_norm\": 0.3606105044554362,\n\
\ \"acc_norm_stderr\": 0.03441254371222723,\n \"mc1\": 0.3047735618115055,\n\
\ \"mc1_stderr\": 0.016114124156882445,\n \"mc2\": 0.47088194473261846,\n\
\ \"mc2_stderr\": 0.015649524281312666\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4197952218430034,\n \"acc_stderr\": 0.014422181226303026,\n\
\ \"acc_norm\": 0.44197952218430037,\n \"acc_norm_stderr\": 0.014512682523128345\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.536247759410476,\n\
\ \"acc_stderr\": 0.004976651989757643,\n \"acc_norm\": 0.7131049591714798,\n\
\ \"acc_norm_stderr\": 0.00451387746506212\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.37037037037037035,\n\
\ \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.37037037037037035,\n\
\ \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.28289473684210525,\n \"acc_stderr\": 0.03665349695640766,\n\
\ \"acc_norm\": 0.28289473684210525,\n \"acc_norm_stderr\": 0.03665349695640766\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.42,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.3622641509433962,\n \"acc_stderr\": 0.0295822451283843,\n\
\ \"acc_norm\": 0.3622641509433962,\n \"acc_norm_stderr\": 0.0295822451283843\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2986111111111111,\n\
\ \"acc_stderr\": 0.038270523579507554,\n \"acc_norm\": 0.2986111111111111,\n\
\ \"acc_norm_stderr\": 0.038270523579507554\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n\
\ \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.35260115606936415,\n\
\ \"acc_stderr\": 0.036430371689585475,\n \"acc_norm\": 0.35260115606936415,\n\
\ \"acc_norm_stderr\": 0.036430371689585475\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.03950581861179961,\n\
\ \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.03950581861179961\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n\
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3617021276595745,\n \"acc_stderr\": 0.03141082197596241,\n\
\ \"acc_norm\": 0.3617021276595745,\n \"acc_norm_stderr\": 0.03141082197596241\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\
\ \"acc_stderr\": 0.04142439719489361,\n \"acc_norm\": 0.2631578947368421,\n\
\ \"acc_norm_stderr\": 0.04142439719489361\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.3448275862068966,\n \"acc_stderr\": 0.03960933549451208,\n\
\ \"acc_norm\": 0.3448275862068966,\n \"acc_norm_stderr\": 0.03960933549451208\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2724867724867725,\n \"acc_stderr\": 0.022930973071633345,\n \"\
acc_norm\": 0.2724867724867725,\n \"acc_norm_stderr\": 0.022930973071633345\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.04006168083848877,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.04006168083848877\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952365,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952365\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.3774193548387097,\n \"acc_stderr\": 0.027575960723278243,\n \"\
acc_norm\": 0.3774193548387097,\n \"acc_norm_stderr\": 0.027575960723278243\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.22167487684729065,\n \"acc_stderr\": 0.029225575892489614,\n \"\
acc_norm\": 0.22167487684729065,\n \"acc_norm_stderr\": 0.029225575892489614\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\"\
: 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.4727272727272727,\n \"acc_stderr\": 0.03898531605579419,\n\
\ \"acc_norm\": 0.4727272727272727,\n \"acc_norm_stderr\": 0.03898531605579419\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.32323232323232326,\n \"acc_stderr\": 0.03332299921070644,\n \"\
acc_norm\": 0.32323232323232326,\n \"acc_norm_stderr\": 0.03332299921070644\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.43523316062176165,\n \"acc_stderr\": 0.03578038165008587,\n\
\ \"acc_norm\": 0.43523316062176165,\n \"acc_norm_stderr\": 0.03578038165008587\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.3128205128205128,\n \"acc_stderr\": 0.023507579020645365,\n\
\ \"acc_norm\": 0.3128205128205128,\n \"acc_norm_stderr\": 0.023507579020645365\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.25925925925925924,\n \"acc_stderr\": 0.026719240783712173,\n \
\ \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.026719240783712173\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.030388353551886845,\n\
\ \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.030388353551886845\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2847682119205298,\n \"acc_stderr\": 0.03684881521389024,\n \"\
acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.03684881521389024\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.43486238532110094,\n \"acc_stderr\": 0.021254631465609273,\n \"\
acc_norm\": 0.43486238532110094,\n \"acc_norm_stderr\": 0.021254631465609273\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.18981481481481483,\n \"acc_stderr\": 0.026744714834691936,\n \"\
acc_norm\": 0.18981481481481483,\n \"acc_norm_stderr\": 0.026744714834691936\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.4215686274509804,\n \"acc_stderr\": 0.03465868196380757,\n \"\
acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.03465868196380757\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.5189873417721519,\n \"acc_stderr\": 0.03252375148090448,\n \
\ \"acc_norm\": 0.5189873417721519,\n \"acc_norm_stderr\": 0.03252375148090448\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.4080717488789238,\n\
\ \"acc_stderr\": 0.03298574607842822,\n \"acc_norm\": 0.4080717488789238,\n\
\ \"acc_norm_stderr\": 0.03298574607842822\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.37404580152671757,\n \"acc_stderr\": 0.042438692422305246,\n\
\ \"acc_norm\": 0.37404580152671757,\n \"acc_norm_stderr\": 0.042438692422305246\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5041322314049587,\n \"acc_stderr\": 0.04564198767432754,\n \"\
acc_norm\": 0.5041322314049587,\n \"acc_norm_stderr\": 0.04564198767432754\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.04803752235190193,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.04803752235190193\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3006134969325153,\n \"acc_stderr\": 0.03602511318806771,\n\
\ \"acc_norm\": 0.3006134969325153,\n \"acc_norm_stderr\": 0.03602511318806771\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.22321428571428573,\n\
\ \"acc_stderr\": 0.039523019677025116,\n \"acc_norm\": 0.22321428571428573,\n\
\ \"acc_norm_stderr\": 0.039523019677025116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.39805825242718446,\n \"acc_stderr\": 0.04846748253977238,\n\
\ \"acc_norm\": 0.39805825242718446,\n \"acc_norm_stderr\": 0.04846748253977238\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.5085470085470085,\n\
\ \"acc_stderr\": 0.0327513030009703,\n \"acc_norm\": 0.5085470085470085,\n\
\ \"acc_norm_stderr\": 0.0327513030009703\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.4508301404853129,\n\
\ \"acc_stderr\": 0.017793297572699037,\n \"acc_norm\": 0.4508301404853129,\n\
\ \"acc_norm_stderr\": 0.017793297572699037\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.4161849710982659,\n \"acc_stderr\": 0.026538189104705477,\n\
\ \"acc_norm\": 0.4161849710982659,\n \"acc_norm_stderr\": 0.026538189104705477\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24804469273743016,\n\
\ \"acc_stderr\": 0.014444157808261441,\n \"acc_norm\": 0.24804469273743016,\n\
\ \"acc_norm_stderr\": 0.014444157808261441\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.33986928104575165,\n \"acc_stderr\": 0.027121956071388852,\n\
\ \"acc_norm\": 0.33986928104575165,\n \"acc_norm_stderr\": 0.027121956071388852\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3762057877813505,\n\
\ \"acc_stderr\": 0.02751392568354943,\n \"acc_norm\": 0.3762057877813505,\n\
\ \"acc_norm_stderr\": 0.02751392568354943\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.3611111111111111,\n \"acc_stderr\": 0.026725868809100786,\n\
\ \"acc_norm\": 0.3611111111111111,\n \"acc_norm_stderr\": 0.026725868809100786\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.31560283687943264,\n \"acc_stderr\": 0.027724989449509314,\n \
\ \"acc_norm\": 0.31560283687943264,\n \"acc_norm_stderr\": 0.027724989449509314\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2953063885267275,\n\
\ \"acc_stderr\": 0.011651061936208821,\n \"acc_norm\": 0.2953063885267275,\n\
\ \"acc_norm_stderr\": 0.011651061936208821\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.24632352941176472,\n \"acc_stderr\": 0.02617343857052,\n\
\ \"acc_norm\": 0.24632352941176472,\n \"acc_norm_stderr\": 0.02617343857052\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.35947712418300654,\n \"acc_stderr\": 0.01941253924203216,\n \
\ \"acc_norm\": 0.35947712418300654,\n \"acc_norm_stderr\": 0.01941253924203216\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.42727272727272725,\n\
\ \"acc_stderr\": 0.04738198703545483,\n \"acc_norm\": 0.42727272727272725,\n\
\ \"acc_norm_stderr\": 0.04738198703545483\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.3020408163265306,\n \"acc_stderr\": 0.029393609319879815,\n\
\ \"acc_norm\": 0.3020408163265306,\n \"acc_norm_stderr\": 0.029393609319879815\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.43283582089552236,\n\
\ \"acc_stderr\": 0.03503490923673282,\n \"acc_norm\": 0.43283582089552236,\n\
\ \"acc_norm_stderr\": 0.03503490923673282\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.28313253012048195,\n\
\ \"acc_stderr\": 0.03507295431370518,\n \"acc_norm\": 0.28313253012048195,\n\
\ \"acc_norm_stderr\": 0.03507295431370518\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.4269005847953216,\n \"acc_stderr\": 0.03793620616529917,\n\
\ \"acc_norm\": 0.4269005847953216,\n \"acc_norm_stderr\": 0.03793620616529917\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3047735618115055,\n\
\ \"mc1_stderr\": 0.016114124156882445,\n \"mc2\": 0.47088194473261846,\n\
\ \"mc2_stderr\": 0.015649524281312666\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.659037095501184,\n \"acc_stderr\": 0.013322681435934807\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.052312357846853674,\n \
\ \"acc_stderr\": 0.006133057708959229\n }\n}\n```"
repo_url: https://huggingface.co/openthaigpt/openthaigpt-1.0.0-7b-chat
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_08T09_02_56.717414
path:
- '**/details_harness|arc:challenge|25_2024-04-08T09-02-56.717414.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-08T09-02-56.717414.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_08T09_02_56.717414
path:
- '**/details_harness|gsm8k|5_2024-04-08T09-02-56.717414.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-08T09-02-56.717414.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_08T09_02_56.717414
path:
- '**/details_harness|hellaswag|10_2024-04-08T09-02-56.717414.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-08T09-02-56.717414.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_08T09_02_56.717414
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-08T09-02-56.717414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-08T09-02-56.717414.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-08T09-02-56.717414.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_08T09_02_56.717414
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-08T09-02-56.717414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-08T09-02-56.717414.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_08T09_02_56.717414
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-08T09-02-56.717414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-08T09-02-56.717414.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_08T09_02_56.717414
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-08T09-02-56.717414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-08T09-02-56.717414.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_08T09_02_56.717414
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-08T09-02-56.717414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-08T09-02-56.717414.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_08T09_02_56.717414
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-08T09-02-56.717414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-08T09-02-56.717414.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_08T09_02_56.717414
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-08T09-02-56.717414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-08T09-02-56.717414.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_08T09_02_56.717414
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-08T09-02-56.717414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-08T09-02-56.717414.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_08T09_02_56.717414
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-08T09-02-56.717414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-08T09-02-56.717414.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_08T09_02_56.717414
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-08T09-02-56.717414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-08T09-02-56.717414.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_08T09_02_56.717414
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-08T09-02-56.717414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-08T09-02-56.717414.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_08T09_02_56.717414
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-08T09-02-56.717414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-08T09-02-56.717414.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_08T09_02_56.717414
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-08T09-02-56.717414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-08T09-02-56.717414.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_08T09_02_56.717414
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-08T09-02-56.717414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-08T09-02-56.717414.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_08T09_02_56.717414
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-08T09-02-56.717414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-08T09-02-56.717414.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_08T09_02_56.717414
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-08T09-02-56.717414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-08T09-02-56.717414.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_08T09_02_56.717414
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-08T09-02-56.717414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-08T09-02-56.717414.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_08T09_02_56.717414
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-08T09-02-56.717414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-08T09-02-56.717414.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_08T09_02_56.717414
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-08T09-02-56.717414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-08T09-02-56.717414.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_08T09_02_56.717414
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-08T09-02-56.717414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-08T09-02-56.717414.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_08T09_02_56.717414
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-08T09-02-56.717414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-08T09-02-56.717414.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_08T09_02_56.717414
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-08T09-02-56.717414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-08T09-02-56.717414.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_08T09_02_56.717414
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-08T09-02-56.717414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-08T09-02-56.717414.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_08T09_02_56.717414
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-08T09-02-56.717414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-08T09-02-56.717414.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_08T09_02_56.717414
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-08T09-02-56.717414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-08T09-02-56.717414.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_08T09_02_56.717414
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-08T09-02-56.717414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-08T09-02-56.717414.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_08T09_02_56.717414
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-08T09-02-56.717414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-08T09-02-56.717414.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_08T09_02_56.717414
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-08T09-02-56.717414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-08T09-02-56.717414.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_08T09_02_56.717414
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-08T09-02-56.717414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-08T09-02-56.717414.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_08T09_02_56.717414
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-08T09-02-56.717414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-08T09-02-56.717414.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_08T09_02_56.717414
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-08T09-02-56.717414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-08T09-02-56.717414.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_08T09_02_56.717414
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-08T09-02-56.717414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-08T09-02-56.717414.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_08T09_02_56.717414
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-08T09-02-56.717414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-08T09-02-56.717414.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_08T09_02_56.717414
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-08T09-02-56.717414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-08T09-02-56.717414.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_08T09_02_56.717414
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-08T09-02-56.717414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-08T09-02-56.717414.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_08T09_02_56.717414
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-08T09-02-56.717414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-08T09-02-56.717414.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_08T09_02_56.717414
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-08T09-02-56.717414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-08T09-02-56.717414.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_08T09_02_56.717414
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-08T09-02-56.717414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-08T09-02-56.717414.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_08T09_02_56.717414
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-08T09-02-56.717414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-08T09-02-56.717414.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_08T09_02_56.717414
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-08T09-02-56.717414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-08T09-02-56.717414.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_08T09_02_56.717414
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-08T09-02-56.717414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-08T09-02-56.717414.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_08T09_02_56.717414
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-08T09-02-56.717414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-08T09-02-56.717414.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_08T09_02_56.717414
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-08T09-02-56.717414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-08T09-02-56.717414.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_08T09_02_56.717414
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-08T09-02-56.717414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-08T09-02-56.717414.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_08T09_02_56.717414
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-08T09-02-56.717414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-08T09-02-56.717414.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_08T09_02_56.717414
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-08T09-02-56.717414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-08T09-02-56.717414.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_08T09_02_56.717414
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-08T09-02-56.717414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-08T09-02-56.717414.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_08T09_02_56.717414
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-08T09-02-56.717414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-08T09-02-56.717414.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_08T09_02_56.717414
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-08T09-02-56.717414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-08T09-02-56.717414.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_08T09_02_56.717414
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-08T09-02-56.717414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-08T09-02-56.717414.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_08T09_02_56.717414
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-08T09-02-56.717414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-08T09-02-56.717414.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_08T09_02_56.717414
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-08T09-02-56.717414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-08T09-02-56.717414.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_08T09_02_56.717414
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-08T09-02-56.717414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-08T09-02-56.717414.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_08T09_02_56.717414
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-08T09-02-56.717414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-08T09-02-56.717414.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_08T09_02_56.717414
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-08T09-02-56.717414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-08T09-02-56.717414.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_08T09_02_56.717414
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-08T09-02-56.717414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-08T09-02-56.717414.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_08T09_02_56.717414
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-08T09-02-56.717414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-08T09-02-56.717414.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_08T09_02_56.717414
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-08T09-02-56.717414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-08T09-02-56.717414.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_08T09_02_56.717414
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-08T09-02-56.717414.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-08T09-02-56.717414.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_08T09_02_56.717414
path:
- '**/details_harness|winogrande|5_2024-04-08T09-02-56.717414.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-08T09-02-56.717414.parquet'
- config_name: results
data_files:
- split: 2024_04_08T09_02_56.717414
path:
- results_2024-04-08T09-02-56.717414.parquet
- split: latest
path:
- results_2024-04-08T09-02-56.717414.parquet
---
# Dataset Card for Evaluation run of openthaigpt/openthaigpt-1.0.0-7b-chat
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [openthaigpt/openthaigpt-1.0.0-7b-chat](https://huggingface.co/openthaigpt/openthaigpt-1.0.0-7b-chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_openthaigpt__openthaigpt-1.0.0-7b-chat",
"harness_winogrande_5",
split="train")
```
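Since each run is stored as a split named after its timestamp, the most recent run can be picked programmatically by sorting the split names. A minimal illustrative helper (not part of the `datasets` API — the format `YYYY_MM_DDTHH_MM_SS` sorts chronologically as a plain string):

```python
def latest_split(split_names):
    """Return the most recent timestamped split, ignoring the 'latest' alias.

    Timestamps like '2024_04_08T09_02_56.717414' are zero-padded, so
    lexicographic order matches chronological order and max() suffices.
    """
    stamped = [s for s in split_names if s != "latest"]
    return max(stamped)


splits = ["2024_04_08T09_02_56.717414", "latest"]
print(latest_split(splits))  # -> 2024_04_08T09_02_56.717414
```

In practice the "latest" split alias already points at this split, so the helper is only needed when comparing multiple historical runs.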
## Latest results
These are the [latest results from run 2024-04-08T09:02:56.717414](https://huggingface.co/datasets/open-llm-leaderboard/details_openthaigpt__openthaigpt-1.0.0-7b-chat/blob/main/results_2024-04-08T09-02-56.717414.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each task in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.35718570026420443,
"acc_stderr": 0.03360931297367493,
"acc_norm": 0.3606105044554362,
"acc_norm_stderr": 0.03441254371222723,
"mc1": 0.3047735618115055,
"mc1_stderr": 0.016114124156882445,
"mc2": 0.47088194473261846,
"mc2_stderr": 0.015649524281312666
},
"harness|arc:challenge|25": {
"acc": 0.4197952218430034,
"acc_stderr": 0.014422181226303026,
"acc_norm": 0.44197952218430037,
"acc_norm_stderr": 0.014512682523128345
},
"harness|hellaswag|10": {
"acc": 0.536247759410476,
"acc_stderr": 0.004976651989757643,
"acc_norm": 0.7131049591714798,
"acc_norm_stderr": 0.00451387746506212
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.28289473684210525,
"acc_stderr": 0.03665349695640766,
"acc_norm": 0.28289473684210525,
"acc_norm_stderr": 0.03665349695640766
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.3622641509433962,
"acc_stderr": 0.0295822451283843,
"acc_norm": 0.3622641509433962,
"acc_norm_stderr": 0.0295822451283843
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2986111111111111,
"acc_stderr": 0.038270523579507554,
"acc_norm": 0.2986111111111111,
"acc_norm_stderr": 0.038270523579507554
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.35260115606936415,
"acc_stderr": 0.036430371689585475,
"acc_norm": 0.35260115606936415,
"acc_norm_stderr": 0.036430371689585475
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.19607843137254902,
"acc_stderr": 0.03950581861179961,
"acc_norm": 0.19607843137254902,
"acc_norm_stderr": 0.03950581861179961
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3617021276595745,
"acc_stderr": 0.03141082197596241,
"acc_norm": 0.3617021276595745,
"acc_norm_stderr": 0.03141082197596241
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.04142439719489361,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.04142439719489361
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.3448275862068966,
"acc_stderr": 0.03960933549451208,
"acc_norm": 0.3448275862068966,
"acc_norm_stderr": 0.03960933549451208
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2724867724867725,
"acc_stderr": 0.022930973071633345,
"acc_norm": 0.2724867724867725,
"acc_norm_stderr": 0.022930973071633345
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.04006168083848877,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.04006168083848877
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952365,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952365
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3774193548387097,
"acc_stderr": 0.027575960723278243,
"acc_norm": 0.3774193548387097,
"acc_norm_stderr": 0.027575960723278243
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.22167487684729065,
"acc_stderr": 0.029225575892489614,
"acc_norm": 0.22167487684729065,
"acc_norm_stderr": 0.029225575892489614
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.4727272727272727,
"acc_stderr": 0.03898531605579419,
"acc_norm": 0.4727272727272727,
"acc_norm_stderr": 0.03898531605579419
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.32323232323232326,
"acc_stderr": 0.03332299921070644,
"acc_norm": 0.32323232323232326,
"acc_norm_stderr": 0.03332299921070644
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.43523316062176165,
"acc_stderr": 0.03578038165008587,
"acc_norm": 0.43523316062176165,
"acc_norm_stderr": 0.03578038165008587
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3128205128205128,
"acc_stderr": 0.023507579020645365,
"acc_norm": 0.3128205128205128,
"acc_norm_stderr": 0.023507579020645365
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.026719240783712173,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.026719240783712173
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3235294117647059,
"acc_stderr": 0.030388353551886845,
"acc_norm": 0.3235294117647059,
"acc_norm_stderr": 0.030388353551886845
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2847682119205298,
"acc_stderr": 0.03684881521389024,
"acc_norm": 0.2847682119205298,
"acc_norm_stderr": 0.03684881521389024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.43486238532110094,
"acc_stderr": 0.021254631465609273,
"acc_norm": 0.43486238532110094,
"acc_norm_stderr": 0.021254631465609273
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.18981481481481483,
"acc_stderr": 0.026744714834691936,
"acc_norm": 0.18981481481481483,
"acc_norm_stderr": 0.026744714834691936
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.03465868196380757,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.03465868196380757
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5189873417721519,
"acc_stderr": 0.03252375148090448,
"acc_norm": 0.5189873417721519,
"acc_norm_stderr": 0.03252375148090448
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.4080717488789238,
"acc_stderr": 0.03298574607842822,
"acc_norm": 0.4080717488789238,
"acc_norm_stderr": 0.03298574607842822
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.37404580152671757,
"acc_stderr": 0.042438692422305246,
"acc_norm": 0.37404580152671757,
"acc_norm_stderr": 0.042438692422305246
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5041322314049587,
"acc_stderr": 0.04564198767432754,
"acc_norm": 0.5041322314049587,
"acc_norm_stderr": 0.04564198767432754
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04803752235190193,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04803752235190193
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3006134969325153,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.3006134969325153,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.22321428571428573,
"acc_stderr": 0.039523019677025116,
"acc_norm": 0.22321428571428573,
"acc_norm_stderr": 0.039523019677025116
},
"harness|hendrycksTest-management|5": {
"acc": 0.39805825242718446,
"acc_stderr": 0.04846748253977238,
"acc_norm": 0.39805825242718446,
"acc_norm_stderr": 0.04846748253977238
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.5085470085470085,
"acc_stderr": 0.0327513030009703,
"acc_norm": 0.5085470085470085,
"acc_norm_stderr": 0.0327513030009703
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.4508301404853129,
"acc_stderr": 0.017793297572699037,
"acc_norm": 0.4508301404853129,
"acc_norm_stderr": 0.017793297572699037
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.4161849710982659,
"acc_stderr": 0.026538189104705477,
"acc_norm": 0.4161849710982659,
"acc_norm_stderr": 0.026538189104705477
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24804469273743016,
"acc_stderr": 0.014444157808261441,
"acc_norm": 0.24804469273743016,
"acc_norm_stderr": 0.014444157808261441
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.33986928104575165,
"acc_stderr": 0.027121956071388852,
"acc_norm": 0.33986928104575165,
"acc_norm_stderr": 0.027121956071388852
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3762057877813505,
"acc_stderr": 0.02751392568354943,
"acc_norm": 0.3762057877813505,
"acc_norm_stderr": 0.02751392568354943
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.3611111111111111,
"acc_stderr": 0.026725868809100786,
"acc_norm": 0.3611111111111111,
"acc_norm_stderr": 0.026725868809100786
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.31560283687943264,
"acc_stderr": 0.027724989449509314,
"acc_norm": 0.31560283687943264,
"acc_norm_stderr": 0.027724989449509314
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2953063885267275,
"acc_stderr": 0.011651061936208821,
"acc_norm": 0.2953063885267275,
"acc_norm_stderr": 0.011651061936208821
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.24632352941176472,
"acc_stderr": 0.02617343857052,
"acc_norm": 0.24632352941176472,
"acc_norm_stderr": 0.02617343857052
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.35947712418300654,
"acc_stderr": 0.01941253924203216,
"acc_norm": 0.35947712418300654,
"acc_norm_stderr": 0.01941253924203216
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.42727272727272725,
"acc_stderr": 0.04738198703545483,
"acc_norm": 0.42727272727272725,
"acc_norm_stderr": 0.04738198703545483
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.3020408163265306,
"acc_stderr": 0.029393609319879815,
"acc_norm": 0.3020408163265306,
"acc_norm_stderr": 0.029393609319879815
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.43283582089552236,
"acc_stderr": 0.03503490923673282,
"acc_norm": 0.43283582089552236,
"acc_norm_stderr": 0.03503490923673282
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370518,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370518
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.4269005847953216,
"acc_stderr": 0.03793620616529917,
"acc_norm": 0.4269005847953216,
"acc_norm_stderr": 0.03793620616529917
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3047735618115055,
"mc1_stderr": 0.016114124156882445,
"mc2": 0.47088194473261846,
"mc2_stderr": 0.015649524281312666
},
"harness|winogrande|5": {
"acc": 0.659037095501184,
"acc_stderr": 0.013322681435934807
},
"harness|gsm8k|5": {
"acc": 0.052312357846853674,
"acc_stderr": 0.006133057708959229
}
}
```
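The per-task scores above can be aggregated into a single MMLU-style average by taking the mean over the `hendrycksTest` entries. A small sketch using a few of the values copied from the JSON above (the full dict has 57 MMLU tasks, and the leaderboard's exact aggregation may differ — this averages `acc_norm` for illustration):

```python
# A few per-task results copied from the JSON above (truncated for brevity).
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc_norm": 0.29},
    "harness|hendrycksTest-anatomy|5": {"acc_norm": 0.37037037037037035},
    "harness|winogrande|5": {"acc": 0.659037095501184},  # not an MMLU task
}

# Select only the MMLU (hendrycksTest) entries and average their acc_norm.
mmlu = [
    v["acc_norm"]
    for k, v in results.items()
    if k.startswith("harness|hendrycksTest-")
]
print(sum(mmlu) / len(mmlu))
```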
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Supreeta03/CREMA-melSpecImages | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': Anger
'1': Happy
'2': Fear
'3': Sad
'4': Disgust
'5': Neutral
splits:
- name: train
num_bytes: 365733343.75
num_examples: 7442
download_size: 365604204
dataset_size: 365733343.75
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
anjunhu/naively_captioned_CUB2002011_test_9shot | ---
dataset_info:
features:
- name: text
dtype: string
- name: text_cupl
dtype: string
- name: image
dtype: image
splits:
- name: train
num_bytes: 49482951.0
num_examples: 1800
download_size: 43961740
dataset_size: 49482951.0
---
# Dataset Card for "naively_captioned_CUB2002011_test_9shot"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_silvainrichou__gemma-3b-002 | ---
pretty_name: Evaluation run of silvainrichou/gemma-3b-002
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [silvainrichou/gemma-3b-002](https://huggingface.co/silvainrichou/gemma-3b-002)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_silvainrichou__gemma-3b-002\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-15T14:09:20.686818](https://huggingface.co/datasets/open-llm-leaderboard/details_silvainrichou__gemma-3b-002/blob/main/results_2024-03-15T14-09-20.686818.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.3698166825711143,\n\
\ \"acc_stderr\": 0.033990488250744445,\n \"acc_norm\": 0.3743393450227912,\n\
\ \"acc_norm_stderr\": 0.03480240983497677,\n \"mc1\": 0.25703794369645044,\n\
\ \"mc1_stderr\": 0.01529807750948508,\n \"mc2\": 0.42683291459811795,\n\
\ \"mc2_stderr\": 0.015100542524354723\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4052901023890785,\n \"acc_stderr\": 0.014346869060229328,\n\
\ \"acc_norm\": 0.4334470989761092,\n \"acc_norm_stderr\": 0.014481376224558898\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.47769368651663013,\n\
\ \"acc_stderr\": 0.004984813391016206,\n \"acc_norm\": 0.6406094403505278,\n\
\ \"acc_norm_stderr\": 0.004788412062375705\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4222222222222222,\n\
\ \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.4222222222222222,\n\
\ \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.2631578947368421,\n \"acc_stderr\": 0.03583496176361063,\n\
\ \"acc_norm\": 0.2631578947368421,\n \"acc_norm_stderr\": 0.03583496176361063\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.39,\n\
\ \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.4188679245283019,\n \"acc_stderr\": 0.03036505082911521,\n\
\ \"acc_norm\": 0.4188679245283019,\n \"acc_norm_stderr\": 0.03036505082911521\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.375,\n\
\ \"acc_stderr\": 0.04048439222695598,\n \"acc_norm\": 0.375,\n \
\ \"acc_norm_stderr\": 0.04048439222695598\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.31213872832369943,\n\
\ \"acc_stderr\": 0.035331333893236574,\n \"acc_norm\": 0.31213872832369943,\n\
\ \"acc_norm_stderr\": 0.035331333893236574\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.04220773659171452,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.04220773659171452\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3446808510638298,\n \"acc_stderr\": 0.03106898596312215,\n\
\ \"acc_norm\": 0.3446808510638298,\n \"acc_norm_stderr\": 0.03106898596312215\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n\
\ \"acc_stderr\": 0.042270544512322,\n \"acc_norm\": 0.2807017543859649,\n\
\ \"acc_norm_stderr\": 0.042270544512322\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.41379310344827586,\n \"acc_stderr\": 0.041042692118062316,\n\
\ \"acc_norm\": 0.41379310344827586,\n \"acc_norm_stderr\": 0.041042692118062316\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.25396825396825395,\n \"acc_stderr\": 0.022418042891113935,\n \"\
acc_norm\": 0.25396825396825395,\n \"acc_norm_stderr\": 0.022418042891113935\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.29365079365079366,\n\
\ \"acc_stderr\": 0.04073524322147126,\n \"acc_norm\": 0.29365079365079366,\n\
\ \"acc_norm_stderr\": 0.04073524322147126\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.4096774193548387,\n\
\ \"acc_stderr\": 0.027976054915347357,\n \"acc_norm\": 0.4096774193548387,\n\
\ \"acc_norm_stderr\": 0.027976054915347357\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3251231527093596,\n \"acc_stderr\": 0.03295797566311271,\n\
\ \"acc_norm\": 0.3251231527093596,\n \"acc_norm_stderr\": 0.03295797566311271\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\"\
: 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.03681050869161549,\n\
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.03681050869161549\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.4393939393939394,\n \"acc_stderr\": 0.035360859475294805,\n \"\
acc_norm\": 0.4393939393939394,\n \"acc_norm_stderr\": 0.035360859475294805\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.41968911917098445,\n \"acc_stderr\": 0.035615873276858834,\n\
\ \"acc_norm\": 0.41968911917098445,\n \"acc_norm_stderr\": 0.035615873276858834\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.31794871794871793,\n \"acc_stderr\": 0.023610884308927865,\n\
\ \"acc_norm\": 0.31794871794871793,\n \"acc_norm_stderr\": 0.023610884308927865\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24444444444444444,\n \"acc_stderr\": 0.02620276653465215,\n \
\ \"acc_norm\": 0.24444444444444444,\n \"acc_norm_stderr\": 0.02620276653465215\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3445378151260504,\n \"acc_stderr\": 0.030868682604121626,\n\
\ \"acc_norm\": 0.3445378151260504,\n \"acc_norm_stderr\": 0.030868682604121626\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2781456953642384,\n \"acc_stderr\": 0.03658603262763743,\n \"\
acc_norm\": 0.2781456953642384,\n \"acc_norm_stderr\": 0.03658603262763743\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.42935779816513764,\n \"acc_stderr\": 0.021222286397236508,\n \"\
acc_norm\": 0.42935779816513764,\n \"acc_norm_stderr\": 0.021222286397236508\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.2916666666666667,\n \"acc_stderr\": 0.03099866630456053,\n \"\
acc_norm\": 0.2916666666666667,\n \"acc_norm_stderr\": 0.03099866630456053\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.36764705882352944,\n \"acc_stderr\": 0.03384132045674119,\n \"\
acc_norm\": 0.36764705882352944,\n \"acc_norm_stderr\": 0.03384132045674119\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.3755274261603376,\n \"acc_stderr\": 0.03152256243091156,\n \
\ \"acc_norm\": 0.3755274261603376,\n \"acc_norm_stderr\": 0.03152256243091156\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.452914798206278,\n\
\ \"acc_stderr\": 0.03340867501923324,\n \"acc_norm\": 0.452914798206278,\n\
\ \"acc_norm_stderr\": 0.03340867501923324\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.3816793893129771,\n \"acc_stderr\": 0.042607351576445594,\n\
\ \"acc_norm\": 0.3816793893129771,\n \"acc_norm_stderr\": 0.042607351576445594\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5371900826446281,\n \"acc_stderr\": 0.04551711196104218,\n \"\
acc_norm\": 0.5371900826446281,\n \"acc_norm_stderr\": 0.04551711196104218\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.4166666666666667,\n\
\ \"acc_stderr\": 0.04766075165356461,\n \"acc_norm\": 0.4166666666666667,\n\
\ \"acc_norm_stderr\": 0.04766075165356461\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.39263803680981596,\n \"acc_stderr\": 0.03836740907831029,\n\
\ \"acc_norm\": 0.39263803680981596,\n \"acc_norm_stderr\": 0.03836740907831029\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3392857142857143,\n\
\ \"acc_stderr\": 0.04493949068613539,\n \"acc_norm\": 0.3392857142857143,\n\
\ \"acc_norm_stderr\": 0.04493949068613539\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.4563106796116505,\n \"acc_stderr\": 0.049318019942204146,\n\
\ \"acc_norm\": 0.4563106796116505,\n \"acc_norm_stderr\": 0.049318019942204146\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.5299145299145299,\n\
\ \"acc_stderr\": 0.032697411068124425,\n \"acc_norm\": 0.5299145299145299,\n\
\ \"acc_norm_stderr\": 0.032697411068124425\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.45977011494252873,\n\
\ \"acc_stderr\": 0.01782199409693354,\n \"acc_norm\": 0.45977011494252873,\n\
\ \"acc_norm_stderr\": 0.01782199409693354\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.38439306358381503,\n \"acc_stderr\": 0.026189666966272035,\n\
\ \"acc_norm\": 0.38439306358381503,\n \"acc_norm_stderr\": 0.026189666966272035\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2324022346368715,\n\
\ \"acc_stderr\": 0.01412596875467338,\n \"acc_norm\": 0.2324022346368715,\n\
\ \"acc_norm_stderr\": 0.01412596875467338\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.39869281045751637,\n \"acc_stderr\": 0.02803609227389176,\n\
\ \"acc_norm\": 0.39869281045751637,\n \"acc_norm_stderr\": 0.02803609227389176\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3890675241157556,\n\
\ \"acc_stderr\": 0.027690337536485376,\n \"acc_norm\": 0.3890675241157556,\n\
\ \"acc_norm_stderr\": 0.027690337536485376\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.44753086419753085,\n \"acc_stderr\": 0.027667138569422694,\n\
\ \"acc_norm\": 0.44753086419753085,\n \"acc_norm_stderr\": 0.027667138569422694\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2695035460992908,\n \"acc_stderr\": 0.02646903681859063,\n \
\ \"acc_norm\": 0.2695035460992908,\n \"acc_norm_stderr\": 0.02646903681859063\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2894393741851369,\n\
\ \"acc_stderr\": 0.011582659702210236,\n \"acc_norm\": 0.2894393741851369,\n\
\ \"acc_norm_stderr\": 0.011582659702210236\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.026303648393696036,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.026303648393696036\n \
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\"\
: 0.35784313725490197,\n \"acc_stderr\": 0.019393058402355442,\n \"\
acc_norm\": 0.35784313725490197,\n \"acc_norm_stderr\": 0.019393058402355442\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.38181818181818183,\n\
\ \"acc_stderr\": 0.046534298079135075,\n \"acc_norm\": 0.38181818181818183,\n\
\ \"acc_norm_stderr\": 0.046534298079135075\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.40408163265306124,\n \"acc_stderr\": 0.031414708025865906,\n\
\ \"acc_norm\": 0.40408163265306124,\n \"acc_norm_stderr\": 0.031414708025865906\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5223880597014925,\n\
\ \"acc_stderr\": 0.03531987930208731,\n \"acc_norm\": 0.5223880597014925,\n\
\ \"acc_norm_stderr\": 0.03531987930208731\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3855421686746988,\n\
\ \"acc_stderr\": 0.037891344246115496,\n \"acc_norm\": 0.3855421686746988,\n\
\ \"acc_norm_stderr\": 0.037891344246115496\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.5029239766081871,\n \"acc_stderr\": 0.03834759370936839,\n\
\ \"acc_norm\": 0.5029239766081871,\n \"acc_norm_stderr\": 0.03834759370936839\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.25703794369645044,\n\
\ \"mc1_stderr\": 0.01529807750948508,\n \"mc2\": 0.42683291459811795,\n\
\ \"mc2_stderr\": 0.015100542524354723\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6085240726124704,\n \"acc_stderr\": 0.013717487071290852\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.05534495830174375,\n \
\ \"acc_stderr\": 0.006298221796179599\n }\n}\n```"
repo_url: https://huggingface.co/silvainrichou/gemma-3b-002
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_15T14_09_20.686818
path:
- '**/details_harness|arc:challenge|25_2024-03-15T14-09-20.686818.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-15T14-09-20.686818.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_15T14_09_20.686818
path:
- '**/details_harness|gsm8k|5_2024-03-15T14-09-20.686818.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-15T14-09-20.686818.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_15T14_09_20.686818
path:
- '**/details_harness|hellaswag|10_2024-03-15T14-09-20.686818.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-15T14-09-20.686818.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_15T14_09_20.686818
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-15T14-09-20.686818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-15T14-09-20.686818.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-15T14-09-20.686818.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_15T14_09_20.686818
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-15T14-09-20.686818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-15T14-09-20.686818.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_15T14_09_20.686818
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-15T14-09-20.686818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-15T14-09-20.686818.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_15T14_09_20.686818
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-15T14-09-20.686818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-15T14-09-20.686818.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_15T14_09_20.686818
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-15T14-09-20.686818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-15T14-09-20.686818.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_15T14_09_20.686818
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-15T14-09-20.686818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-15T14-09-20.686818.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_15T14_09_20.686818
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-15T14-09-20.686818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-15T14-09-20.686818.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_15T14_09_20.686818
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-15T14-09-20.686818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-15T14-09-20.686818.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_15T14_09_20.686818
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-15T14-09-20.686818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-15T14-09-20.686818.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_15T14_09_20.686818
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-15T14-09-20.686818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-15T14-09-20.686818.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_15T14_09_20.686818
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-15T14-09-20.686818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-15T14-09-20.686818.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_15T14_09_20.686818
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-15T14-09-20.686818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-15T14-09-20.686818.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_15T14_09_20.686818
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-15T14-09-20.686818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-15T14-09-20.686818.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_15T14_09_20.686818
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-15T14-09-20.686818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-15T14-09-20.686818.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_15T14_09_20.686818
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-15T14-09-20.686818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-15T14-09-20.686818.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_15T14_09_20.686818
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-15T14-09-20.686818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-15T14-09-20.686818.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_15T14_09_20.686818
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-15T14-09-20.686818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-15T14-09-20.686818.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_15T14_09_20.686818
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-15T14-09-20.686818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-15T14-09-20.686818.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_15T14_09_20.686818
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-15T14-09-20.686818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-15T14-09-20.686818.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_15T14_09_20.686818
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-15T14-09-20.686818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-15T14-09-20.686818.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_15T14_09_20.686818
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-15T14-09-20.686818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-15T14-09-20.686818.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_15T14_09_20.686818
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-15T14-09-20.686818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-15T14-09-20.686818.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_15T14_09_20.686818
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-15T14-09-20.686818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-15T14-09-20.686818.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_15T14_09_20.686818
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-15T14-09-20.686818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-15T14-09-20.686818.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_15T14_09_20.686818
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-15T14-09-20.686818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-15T14-09-20.686818.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_15T14_09_20.686818
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-15T14-09-20.686818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-15T14-09-20.686818.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_15T14_09_20.686818
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-15T14-09-20.686818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-15T14-09-20.686818.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_15T14_09_20.686818
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-15T14-09-20.686818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-15T14-09-20.686818.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_15T14_09_20.686818
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-15T14-09-20.686818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-15T14-09-20.686818.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_15T14_09_20.686818
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-15T14-09-20.686818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-15T14-09-20.686818.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_15T14_09_20.686818
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-15T14-09-20.686818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-15T14-09-20.686818.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_15T14_09_20.686818
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-15T14-09-20.686818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-15T14-09-20.686818.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_15T14_09_20.686818
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-15T14-09-20.686818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-15T14-09-20.686818.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_15T14_09_20.686818
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-15T14-09-20.686818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-15T14-09-20.686818.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_15T14_09_20.686818
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-15T14-09-20.686818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-15T14-09-20.686818.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_15T14_09_20.686818
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-15T14-09-20.686818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-15T14-09-20.686818.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_15T14_09_20.686818
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-15T14-09-20.686818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-15T14-09-20.686818.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_15T14_09_20.686818
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-15T14-09-20.686818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-15T14-09-20.686818.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_15T14_09_20.686818
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-15T14-09-20.686818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-15T14-09-20.686818.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_15T14_09_20.686818
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-15T14-09-20.686818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-15T14-09-20.686818.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_15T14_09_20.686818
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-15T14-09-20.686818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-15T14-09-20.686818.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_15T14_09_20.686818
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-15T14-09-20.686818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-15T14-09-20.686818.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_15T14_09_20.686818
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-15T14-09-20.686818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-15T14-09-20.686818.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_15T14_09_20.686818
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-15T14-09-20.686818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-15T14-09-20.686818.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_15T14_09_20.686818
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-15T14-09-20.686818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-15T14-09-20.686818.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_15T14_09_20.686818
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-15T14-09-20.686818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-15T14-09-20.686818.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_15T14_09_20.686818
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-15T14-09-20.686818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-15T14-09-20.686818.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_15T14_09_20.686818
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-15T14-09-20.686818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-15T14-09-20.686818.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_15T14_09_20.686818
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-15T14-09-20.686818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-15T14-09-20.686818.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_15T14_09_20.686818
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-15T14-09-20.686818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-15T14-09-20.686818.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_15T14_09_20.686818
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-15T14-09-20.686818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-15T14-09-20.686818.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_15T14_09_20.686818
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-15T14-09-20.686818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-15T14-09-20.686818.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_15T14_09_20.686818
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-15T14-09-20.686818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-15T14-09-20.686818.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_15T14_09_20.686818
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-15T14-09-20.686818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-15T14-09-20.686818.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_15T14_09_20.686818
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-15T14-09-20.686818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-15T14-09-20.686818.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_15T14_09_20.686818
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-15T14-09-20.686818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-15T14-09-20.686818.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_15T14_09_20.686818
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-15T14-09-20.686818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-15T14-09-20.686818.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_15T14_09_20.686818
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-15T14-09-20.686818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-15T14-09-20.686818.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_15T14_09_20.686818
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-15T14-09-20.686818.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-15T14-09-20.686818.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_15T14_09_20.686818
path:
- '**/details_harness|winogrande|5_2024-03-15T14-09-20.686818.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-15T14-09-20.686818.parquet'
- config_name: results
data_files:
- split: 2024_03_15T14_09_20.686818
path:
- results_2024-03-15T14-09-20.686818.parquet
- split: latest
path:
- results_2024-03-15T14-09-20.686818.parquet
---
# Dataset Card for Evaluation run of silvainrichou/gemma-3b-002
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [silvainrichou/gemma-3b-002](https://huggingface.co/silvainrichou/gemma-3b-002) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_silvainrichou__gemma-3b-002",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-03-15T14:09:20.686818](https://huggingface.co/datasets/open-llm-leaderboard/details_silvainrichou__gemma-3b-002/blob/main/results_2024-03-15T14-09-20.686818.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.3698166825711143,
"acc_stderr": 0.033990488250744445,
"acc_norm": 0.3743393450227912,
"acc_norm_stderr": 0.03480240983497677,
"mc1": 0.25703794369645044,
"mc1_stderr": 0.01529807750948508,
"mc2": 0.42683291459811795,
"mc2_stderr": 0.015100542524354723
},
"harness|arc:challenge|25": {
"acc": 0.4052901023890785,
"acc_stderr": 0.014346869060229328,
"acc_norm": 0.4334470989761092,
"acc_norm_stderr": 0.014481376224558898
},
"harness|hellaswag|10": {
"acc": 0.47769368651663013,
"acc_stderr": 0.004984813391016206,
"acc_norm": 0.6406094403505278,
"acc_norm_stderr": 0.004788412062375705
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4222222222222222,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.4222222222222222,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.03583496176361063,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.03583496176361063
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4188679245283019,
"acc_stderr": 0.03036505082911521,
"acc_norm": 0.4188679245283019,
"acc_norm_stderr": 0.03036505082911521
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.375,
"acc_stderr": 0.04048439222695598,
"acc_norm": 0.375,
"acc_norm_stderr": 0.04048439222695598
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.31213872832369943,
"acc_stderr": 0.035331333893236574,
"acc_norm": 0.31213872832369943,
"acc_norm_stderr": 0.035331333893236574
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.04220773659171452,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.04220773659171452
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3446808510638298,
"acc_stderr": 0.03106898596312215,
"acc_norm": 0.3446808510638298,
"acc_norm_stderr": 0.03106898596312215
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.042270544512322,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.042270544512322
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.41379310344827586,
"acc_stderr": 0.041042692118062316,
"acc_norm": 0.41379310344827586,
"acc_norm_stderr": 0.041042692118062316
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.022418042891113935,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.022418042891113935
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.29365079365079366,
"acc_stderr": 0.04073524322147126,
"acc_norm": 0.29365079365079366,
"acc_norm_stderr": 0.04073524322147126
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.4096774193548387,
"acc_stderr": 0.027976054915347357,
"acc_norm": 0.4096774193548387,
"acc_norm_stderr": 0.027976054915347357
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3251231527093596,
"acc_stderr": 0.03295797566311271,
"acc_norm": 0.3251231527093596,
"acc_norm_stderr": 0.03295797566311271
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.03681050869161549,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.03681050869161549
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.4393939393939394,
"acc_stderr": 0.035360859475294805,
"acc_norm": 0.4393939393939394,
"acc_norm_stderr": 0.035360859475294805
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.41968911917098445,
"acc_stderr": 0.035615873276858834,
"acc_norm": 0.41968911917098445,
"acc_norm_stderr": 0.035615873276858834
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.31794871794871793,
"acc_stderr": 0.023610884308927865,
"acc_norm": 0.31794871794871793,
"acc_norm_stderr": 0.023610884308927865
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24444444444444444,
"acc_stderr": 0.02620276653465215,
"acc_norm": 0.24444444444444444,
"acc_norm_stderr": 0.02620276653465215
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3445378151260504,
"acc_stderr": 0.030868682604121626,
"acc_norm": 0.3445378151260504,
"acc_norm_stderr": 0.030868682604121626
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2781456953642384,
"acc_stderr": 0.03658603262763743,
"acc_norm": 0.2781456953642384,
"acc_norm_stderr": 0.03658603262763743
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.42935779816513764,
"acc_stderr": 0.021222286397236508,
"acc_norm": 0.42935779816513764,
"acc_norm_stderr": 0.021222286397236508
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2916666666666667,
"acc_stderr": 0.03099866630456053,
"acc_norm": 0.2916666666666667,
"acc_norm_stderr": 0.03099866630456053
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.36764705882352944,
"acc_stderr": 0.03384132045674119,
"acc_norm": 0.36764705882352944,
"acc_norm_stderr": 0.03384132045674119
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.3755274261603376,
"acc_stderr": 0.03152256243091156,
"acc_norm": 0.3755274261603376,
"acc_norm_stderr": 0.03152256243091156
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.452914798206278,
"acc_stderr": 0.03340867501923324,
"acc_norm": 0.452914798206278,
"acc_norm_stderr": 0.03340867501923324
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.3816793893129771,
"acc_stderr": 0.042607351576445594,
"acc_norm": 0.3816793893129771,
"acc_norm_stderr": 0.042607351576445594
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5371900826446281,
"acc_stderr": 0.04551711196104218,
"acc_norm": 0.5371900826446281,
"acc_norm_stderr": 0.04551711196104218
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.4166666666666667,
"acc_stderr": 0.04766075165356461,
"acc_norm": 0.4166666666666667,
"acc_norm_stderr": 0.04766075165356461
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.39263803680981596,
"acc_stderr": 0.03836740907831029,
"acc_norm": 0.39263803680981596,
"acc_norm_stderr": 0.03836740907831029
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3392857142857143,
"acc_stderr": 0.04493949068613539,
"acc_norm": 0.3392857142857143,
"acc_norm_stderr": 0.04493949068613539
},
"harness|hendrycksTest-management|5": {
"acc": 0.4563106796116505,
"acc_stderr": 0.049318019942204146,
"acc_norm": 0.4563106796116505,
"acc_norm_stderr": 0.049318019942204146
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.5299145299145299,
"acc_stderr": 0.032697411068124425,
"acc_norm": 0.5299145299145299,
"acc_norm_stderr": 0.032697411068124425
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.45977011494252873,
"acc_stderr": 0.01782199409693354,
"acc_norm": 0.45977011494252873,
"acc_norm_stderr": 0.01782199409693354
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.38439306358381503,
"acc_stderr": 0.026189666966272035,
"acc_norm": 0.38439306358381503,
"acc_norm_stderr": 0.026189666966272035
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2324022346368715,
"acc_stderr": 0.01412596875467338,
"acc_norm": 0.2324022346368715,
"acc_norm_stderr": 0.01412596875467338
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.39869281045751637,
"acc_stderr": 0.02803609227389176,
"acc_norm": 0.39869281045751637,
"acc_norm_stderr": 0.02803609227389176
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3890675241157556,
"acc_stderr": 0.027690337536485376,
"acc_norm": 0.3890675241157556,
"acc_norm_stderr": 0.027690337536485376
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.44753086419753085,
"acc_stderr": 0.027667138569422694,
"acc_norm": 0.44753086419753085,
"acc_norm_stderr": 0.027667138569422694
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2695035460992908,
"acc_stderr": 0.02646903681859063,
"acc_norm": 0.2695035460992908,
"acc_norm_stderr": 0.02646903681859063
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2894393741851369,
"acc_stderr": 0.011582659702210236,
"acc_norm": 0.2894393741851369,
"acc_norm_stderr": 0.011582659702210236
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.25,
"acc_stderr": 0.026303648393696036,
"acc_norm": 0.25,
"acc_norm_stderr": 0.026303648393696036
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.35784313725490197,
"acc_stderr": 0.019393058402355442,
"acc_norm": 0.35784313725490197,
"acc_norm_stderr": 0.019393058402355442
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.38181818181818183,
"acc_stderr": 0.046534298079135075,
"acc_norm": 0.38181818181818183,
"acc_norm_stderr": 0.046534298079135075
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.40408163265306124,
"acc_stderr": 0.031414708025865906,
"acc_norm": 0.40408163265306124,
"acc_norm_stderr": 0.031414708025865906
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5223880597014925,
"acc_stderr": 0.03531987930208731,
"acc_norm": 0.5223880597014925,
"acc_norm_stderr": 0.03531987930208731
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3855421686746988,
"acc_stderr": 0.037891344246115496,
"acc_norm": 0.3855421686746988,
"acc_norm_stderr": 0.037891344246115496
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.5029239766081871,
"acc_stderr": 0.03834759370936839,
"acc_norm": 0.5029239766081871,
"acc_norm_stderr": 0.03834759370936839
},
"harness|truthfulqa:mc|0": {
"mc1": 0.25703794369645044,
"mc1_stderr": 0.01529807750948508,
"mc2": 0.42683291459811795,
"mc2_stderr": 0.015100542524354723
},
"harness|winogrande|5": {
"acc": 0.6085240726124704,
"acc_stderr": 0.013717487071290852
},
"harness|gsm8k|5": {
"acc": 0.05534495830174375,
"acc_stderr": 0.006298221796179599
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
adamo1139/AEZAKMI_v2 | ---
license: other
license_name: other
license_link: LICENSE
---
## New things changed in v2
- added some samples from spicy 3.1
- removed samples that were sending a shiver down RPers' spines!
- removed some more GPTslop like "Remember to always..."
- added some jokes and duplicated some samples with two system prompts - "A chat." and "A chat with uncensored assistant.", so that hopefully the model connects those two and acts more freely.
## New things 2023-02-01
Moved the sharegpt version to a different repo to make it easier to use.
## New things 2023-12-17
I kneel. I don't like that we are using a dataset format adopted from GPT, but it's pretty versatile. \
I added a jsonl in sharegpt format. \
I put in a script that I modified to do the conversion in this repo.
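The conversion script itself lives in the repo, but a minimal sketch of the airoboros-style to ShareGPT conversion could look like the following. Note this is an illustration, not the exact script: the field names (`system`, `instruction`, `response`) and the `conversations` layout are assumptions based on the two formats, so check them against the actual files before reuse.

```python
import json

def to_sharegpt(record):
    """Convert one instruction/response record (assumed fields: 'instruction',
    'response', optional 'system') into the ShareGPT 'conversations' layout."""
    conversations = []
    if record.get("system"):
        conversations.append({"from": "system", "value": record["system"]})
    conversations.append({"from": "human", "value": record["instruction"]})
    conversations.append({"from": "gpt", "value": record["response"]})
    return {"conversations": conversations}

def convert_jsonl(src_path, dst_path):
    """Rewrite a JSONL file record by record into ShareGPT-style JSONL."""
    with open(src_path, encoding="utf-8") as src, \
         open(dst_path, "w", encoding="utf-8") as dst:
        for line in src:
            dst.write(json.dumps(to_sharegpt(json.loads(line)),
                                 ensure_ascii=False) + "\n")
```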
## Base information
This dataset is based on airoboros 2.2.1 with orca and gptslop samples removed. Models trained on this dataset are likely to hallucinate more than base airoboros, since I also removed a lot of samples that made the model aware that it's not a human but an AI and doesn't have a physical body. The upside is that a non-llama model trained on it should very rarely, if ever, issue a refusal. It should also sound more like a person than a sterile GPT-4. I can't guarantee that for llama 2 base models, since they are pre-trained with gptslop and refusals. If you see a model that was trained on this dataset generating refusals, let me know and I will try to fix that. I removed the jokes from airoboros 2.2.1 that I used as a base and put in jokes from airoboros 2.2, as the jokes from 2.2.1 were really lame.
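The slop removal described above can be approximated with a simple phrase blocklist over each sample's response. This is only a hedged sketch of the idea: the phrases below are illustrative examples, not the actual list used to build the dataset.

```python
# Illustrative blocklist; the real filtering used its own (unpublished) phrase list.
SLOP_PHRASES = [
    "shivers down",            # RP cliché
    "remember to always",      # assistant-style moralizing
    "as an ai language model", # refusal/disclaimer boilerplate
    "i cannot fulfill",
]

def is_slop(text):
    """Return True if the text contains any blocklisted phrase (case-insensitive)."""
    lowered = text.lower()
    return any(phrase in lowered for phrase in SLOP_PHRASES)

def filter_samples(samples):
    """Keep only samples whose 'response' field is free of slop phrases."""
    return [s for s in samples if not is_slop(s.get("response", ""))]
```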
A Yi-34B 200K fine-tune on this dataset has been published. I don't think there was any interest in AEZAKMI Mistral v1, so I don't know if it makes sense to train one now. \
I will try to focus now on preparing DPO dataset that will decontaminate raw models that were trained on OpenAI data.
License: same as airoboros 2.2.1/airoboros 2.2/ spicy 3.1 |
liuyanchen1015/MULTI_VALUE_mnli_transitive_suffix | ---
dataset_info:
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev_matched
num_bytes: 1107843
num_examples: 4951
- name: dev_mismatched
num_bytes: 1275786
num_examples: 5520
- name: test_matched
num_bytes: 1145748
num_examples: 5093
- name: test_mismatched
num_bytes: 1248740
num_examples: 5455
- name: train
num_bytes: 45534837
num_examples: 201275
download_size: 32518730
dataset_size: 50312954
---
# Dataset Card for "MULTI_VALUE_mnli_transitive_suffix"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Thermostatic/mistral_platypus | ---
license: mit
---
This is a simplified Open-Platypus dataset, ready for training on a Mistral 7B model. |
sinhala-nlp/NSINA-Media | ---
license: cc-by-sa-4.0
task_categories:
- text-classification
language:
- si
--- |
versae/bibles | ---
language:
- sq
- ar
- az
- be
- bg
- ceb
- zh
- cs
- da
- en
- es
- fi
- fr
- de
- el
- ht
- he
- hi
- hu
- it
- ko
- la
- nl
- no
- pt
- rm
- ru
- sw
- ta
- th
- tr
- vi
--- |
open-llm-leaderboard/details_Sao10K__SthenoWriter-L2-13B | ---
pretty_name: Evaluation run of Sao10K/SthenoWriter-L2-13B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Sao10K/SthenoWriter-L2-13B](https://huggingface.co/Sao10K/SthenoWriter-L2-13B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Sao10K__SthenoWriter-L2-13B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-24T23:46:14.496615](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__SthenoWriter-L2-13B/blob/main/results_2023-10-24T23-46-14.496615.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.002307046979865772,\n\
\ \"em_stderr\": 0.0004913221265094507,\n \"f1\": 0.06478397651006729,\n\
\ \"f1_stderr\": 0.001425510190369328,\n \"acc\": 0.4278473862370922,\n\
\ \"acc_stderr\": 0.010483695573501171\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.002307046979865772,\n \"em_stderr\": 0.0004913221265094507,\n\
\ \"f1\": 0.06478397651006729,\n \"f1_stderr\": 0.001425510190369328\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.11220621683093253,\n \
\ \"acc_stderr\": 0.008693743138242354\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7434885556432518,\n \"acc_stderr\": 0.012273648008759987\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Sao10K/SthenoWriter-L2-13B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_04T09_10_08.992646
path:
- '**/details_harness|arc:challenge|25_2023-10-04T09-10-08.992646.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-04T09-10-08.992646.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_24T23_46_14.496615
path:
- '**/details_harness|drop|3_2023-10-24T23-46-14.496615.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-24T23-46-14.496615.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_24T23_46_14.496615
path:
- '**/details_harness|gsm8k|5_2023-10-24T23-46-14.496615.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-24T23-46-14.496615.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_04T09_10_08.992646
path:
- '**/details_harness|hellaswag|10_2023-10-04T09-10-08.992646.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-04T09-10-08.992646.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_04T09_10_08.992646
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T09-10-08.992646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T09-10-08.992646.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T09-10-08.992646.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_04T09_10_08.992646
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T09-10-08.992646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T09-10-08.992646.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_04T09_10_08.992646
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T09-10-08.992646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T09-10-08.992646.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_04T09_10_08.992646
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T09-10-08.992646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T09-10-08.992646.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_04T09_10_08.992646
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T09-10-08.992646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T09-10-08.992646.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_04T09_10_08.992646
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T09-10-08.992646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T09-10-08.992646.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_04T09_10_08.992646
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T09-10-08.992646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T09-10-08.992646.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_04T09_10_08.992646
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T09-10-08.992646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T09-10-08.992646.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_04T09_10_08.992646
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T09-10-08.992646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T09-10-08.992646.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_04T09_10_08.992646
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T09-10-08.992646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T09-10-08.992646.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_04T09_10_08.992646
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T09-10-08.992646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T09-10-08.992646.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_04T09_10_08.992646
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T09-10-08.992646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T09-10-08.992646.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_04T09_10_08.992646
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T09-10-08.992646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T09-10-08.992646.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_04T09_10_08.992646
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T09-10-08.992646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T09-10-08.992646.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_04T09_10_08.992646
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T09-10-08.992646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T09-10-08.992646.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_04T09_10_08.992646
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T09-10-08.992646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T09-10-08.992646.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_04T09_10_08.992646
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T09-10-08.992646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T09-10-08.992646.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_04T09_10_08.992646
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T09-10-08.992646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T09-10-08.992646.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_04T09_10_08.992646
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T09-10-08.992646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T09-10-08.992646.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_04T09_10_08.992646
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T09-10-08.992646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T09-10-08.992646.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_04T09_10_08.992646
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T09-10-08.992646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T09-10-08.992646.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_04T09_10_08.992646
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T09-10-08.992646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T09-10-08.992646.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_04T09_10_08.992646
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T09-10-08.992646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T09-10-08.992646.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_04T09_10_08.992646
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T09-10-08.992646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T09-10-08.992646.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_04T09_10_08.992646
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T09-10-08.992646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T09-10-08.992646.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_04T09_10_08.992646
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T09-10-08.992646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T09-10-08.992646.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_04T09_10_08.992646
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T09-10-08.992646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T09-10-08.992646.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_04T09_10_08.992646
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T09-10-08.992646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T09-10-08.992646.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_04T09_10_08.992646
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T09-10-08.992646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T09-10-08.992646.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_04T09_10_08.992646
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T09-10-08.992646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T09-10-08.992646.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_04T09_10_08.992646
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T09-10-08.992646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T09-10-08.992646.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_04T09_10_08.992646
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T09-10-08.992646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T09-10-08.992646.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_04T09_10_08.992646
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T09-10-08.992646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T09-10-08.992646.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_04T09_10_08.992646
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T09-10-08.992646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T09-10-08.992646.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_04T09_10_08.992646
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T09-10-08.992646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T09-10-08.992646.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_04T09_10_08.992646
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T09-10-08.992646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T09-10-08.992646.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_04T09_10_08.992646
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T09-10-08.992646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T09-10-08.992646.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_04T09_10_08.992646
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T09-10-08.992646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T09-10-08.992646.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_04T09_10_08.992646
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T09-10-08.992646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T09-10-08.992646.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_04T09_10_08.992646
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-04T09-10-08.992646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-04T09-10-08.992646.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_04T09_10_08.992646
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T09-10-08.992646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T09-10-08.992646.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_04T09_10_08.992646
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T09-10-08.992646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T09-10-08.992646.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_04T09_10_08.992646
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T09-10-08.992646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T09-10-08.992646.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_04T09_10_08.992646
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T09-10-08.992646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T09-10-08.992646.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_04T09_10_08.992646
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T09-10-08.992646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T09-10-08.992646.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_04T09_10_08.992646
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T09-10-08.992646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T09-10-08.992646.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_04T09_10_08.992646
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T09-10-08.992646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T09-10-08.992646.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_04T09_10_08.992646
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T09-10-08.992646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T09-10-08.992646.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_04T09_10_08.992646
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T09-10-08.992646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T09-10-08.992646.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_04T09_10_08.992646
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T09-10-08.992646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T09-10-08.992646.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_04T09_10_08.992646
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T09-10-08.992646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T09-10-08.992646.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_04T09_10_08.992646
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T09-10-08.992646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T09-10-08.992646.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_04T09_10_08.992646
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T09-10-08.992646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T09-10-08.992646.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_04T09_10_08.992646
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T09-10-08.992646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T09-10-08.992646.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_04T09_10_08.992646
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T09-10-08.992646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T09-10-08.992646.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_04T09_10_08.992646
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T09-10-08.992646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T09-10-08.992646.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_04T09_10_08.992646
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T09-10-08.992646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T09-10-08.992646.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_04T09_10_08.992646
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T09-10-08.992646.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T09-10-08.992646.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_04T09_10_08.992646
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-04T09-10-08.992646.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-04T09-10-08.992646.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_24T23_46_14.496615
path:
- '**/details_harness|winogrande|5_2023-10-24T23-46-14.496615.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-24T23-46-14.496615.parquet'
- config_name: results
data_files:
- split: 2023_10_04T09_10_08.992646
path:
- results_2023-10-04T09-10-08.992646.parquet
- split: 2023_10_24T23_46_14.496615
path:
- results_2023-10-24T23-46-14.496615.parquet
- split: latest
path:
- results_2023-10-24T23-46-14.496615.parquet
---
# Dataset Card for Evaluation run of Sao10K/SthenoWriter-L2-13B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Sao10K/SthenoWriter-L2-13B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Sao10K/SthenoWriter-L2-13B](https://huggingface.co/Sao10K/SthenoWriter-L2-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Sao10K__SthenoWriter-L2-13B",
"harness_winogrande_5",
split="train")
```
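Each timestamped run is also exposed as its own split. Judging from the configuration listing above, a run's split name appears to be its ISO timestamp with `-` and `:` replaced by `_`; a minimal sketch of selecting a specific run (the helper name is illustrative, not part of the `datasets` API):

```python
# Sketch (assumption inferred from the config listing above): a run's split name
# is its ISO timestamp with '-' and ':' replaced by '_'.
def run_timestamp_to_split(timestamp: str) -> str:
    return timestamp.replace("-", "_").replace(":", "_")

split_name = run_timestamp_to_split("2023-10-24T23:46:14.496615")
print(split_name)  # 2023_10_24T23_46_14.496615

# The split name could then be passed to load_dataset instead of "train", e.g.:
# data = load_dataset("open-llm-leaderboard/details_Sao10K__SthenoWriter-L2-13B",
#                     "harness_winogrande_5", split=split_name)
```

Passing `split="latest"` selects the most recent run without knowing its timestamp.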
## Latest results
These are the [latest results from run 2023-10-24T23:46:14.496615](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__SthenoWriter-L2-13B/blob/main/results_2023-10-24T23-46-14.496615.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.002307046979865772,
"em_stderr": 0.0004913221265094507,
"f1": 0.06478397651006729,
"f1_stderr": 0.001425510190369328,
"acc": 0.4278473862370922,
"acc_stderr": 0.010483695573501171
},
"harness|drop|3": {
"em": 0.002307046979865772,
"em_stderr": 0.0004913221265094507,
"f1": 0.06478397651006729,
"f1_stderr": 0.001425510190369328
},
"harness|gsm8k|5": {
"acc": 0.11220621683093253,
"acc_stderr": 0.008693743138242354
},
"harness|winogrande|5": {
"acc": 0.7434885556432518,
"acc_stderr": 0.012273648008759987
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
adityarra07/aug_train_1 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcription
dtype: string
- name: id
dtype: string
splits:
- name: train
num_bytes: 364677474.1
num_examples: 2700
- name: test
num_bytes: 36864625.0
num_examples: 300
download_size: 395989646
dataset_size: 401542099.1
---
# Dataset Card for "aug_train1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
debarshi/culturax-hi-higgsfield | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 2058561327
num_examples: 280000
download_size: 781815845
dataset_size: 2058561327
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_AIGym__deepseek-coder-1.3b-chat | ---
pretty_name: Evaluation run of AIGym/deepseek-coder-1.3b-chat
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [AIGym/deepseek-coder-1.3b-chat](https://huggingface.co/AIGym/deepseek-coder-1.3b-chat)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_AIGym__deepseek-coder-1.3b-chat\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-03T15:27:05.050992](https://huggingface.co/datasets/open-llm-leaderboard/details_AIGym__deepseek-coder-1.3b-chat/blob/main/results_2024-02-03T15-27-05.050992.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.256701988196126,\n\
\ \"acc_stderr\": 0.030917309603694414,\n \"acc_norm\": 0.2577806846513683,\n\
\ \"acc_norm_stderr\": 0.031657483876574605,\n \"mc1\": 0.2631578947368421,\n\
\ \"mc1_stderr\": 0.015415241740237017,\n \"mc2\": 0.43935436863671357,\n\
\ \"mc2_stderr\": 0.01489259237232499\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.22610921501706485,\n \"acc_stderr\": 0.012224202097063286,\n\
\ \"acc_norm\": 0.25597269624573377,\n \"acc_norm_stderr\": 0.012753013241244516\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.3301135232025493,\n\
\ \"acc_stderr\": 0.004692926794268453,\n \"acc_norm\": 0.39693288189603665,\n\
\ \"acc_norm_stderr\": 0.004882619484166603\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.042295258468165044,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.042295258468165044\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3111111111111111,\n\
\ \"acc_stderr\": 0.039992628766177214,\n \"acc_norm\": 0.3111111111111111,\n\
\ \"acc_norm_stderr\": 0.039992628766177214\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.15789473684210525,\n \"acc_stderr\": 0.02967416752010146,\n\
\ \"acc_norm\": 0.15789473684210525,\n \"acc_norm_stderr\": 0.02967416752010146\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.24,\n\
\ \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.24,\n \
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.28679245283018867,\n \"acc_stderr\": 0.027834912527544074,\n\
\ \"acc_norm\": 0.28679245283018867,\n \"acc_norm_stderr\": 0.027834912527544074\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2708333333333333,\n\
\ \"acc_stderr\": 0.037161774375660164,\n \"acc_norm\": 0.2708333333333333,\n\
\ \"acc_norm_stderr\": 0.037161774375660164\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.18,\n \"acc_stderr\": 0.038612291966536955,\n \"acc_norm\"\
: 0.18,\n \"acc_norm_stderr\": 0.038612291966536955\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.1907514450867052,\n\
\ \"acc_stderr\": 0.029957851329869334,\n \"acc_norm\": 0.1907514450867052,\n\
\ \"acc_norm_stderr\": 0.029957851329869334\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n\
\ \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.28936170212765955,\n \"acc_stderr\": 0.029644006577009618,\n\
\ \"acc_norm\": 0.28936170212765955,\n \"acc_norm_stderr\": 0.029644006577009618\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.03999423879281337,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.03999423879281337\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.3103448275862069,\n \"acc_stderr\": 0.038552896163789485,\n\
\ \"acc_norm\": 0.3103448275862069,\n \"acc_norm_stderr\": 0.038552896163789485\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.25925925925925924,\n \"acc_stderr\": 0.02256989707491841,\n \"\
acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.02256989707491841\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.21428571428571427,\n\
\ \"acc_stderr\": 0.03670066451047182,\n \"acc_norm\": 0.21428571428571427,\n\
\ \"acc_norm_stderr\": 0.03670066451047182\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.2161290322580645,\n \"acc_stderr\": 0.02341529343356853,\n \"\
acc_norm\": 0.2161290322580645,\n \"acc_norm_stderr\": 0.02341529343356853\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.23645320197044334,\n \"acc_stderr\": 0.029896114291733552,\n \"\
acc_norm\": 0.23645320197044334,\n \"acc_norm_stderr\": 0.029896114291733552\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.24848484848484848,\n \"acc_stderr\": 0.03374402644139404,\n\
\ \"acc_norm\": 0.24848484848484848,\n \"acc_norm_stderr\": 0.03374402644139404\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.20202020202020202,\n \"acc_stderr\": 0.028606204289229886,\n \"\
acc_norm\": 0.20202020202020202,\n \"acc_norm_stderr\": 0.028606204289229886\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.22279792746113988,\n \"acc_stderr\": 0.03003114797764154,\n\
\ \"acc_norm\": 0.22279792746113988,\n \"acc_norm_stderr\": 0.03003114797764154\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2128205128205128,\n \"acc_stderr\": 0.02075242372212802,\n \
\ \"acc_norm\": 0.2128205128205128,\n \"acc_norm_stderr\": 0.02075242372212802\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26666666666666666,\n \"acc_stderr\": 0.026962424325073817,\n \
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.026962424325073817\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.19747899159663865,\n \"acc_stderr\": 0.025859164122051463,\n\
\ \"acc_norm\": 0.19747899159663865,\n \"acc_norm_stderr\": 0.025859164122051463\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.23178807947019867,\n \"acc_stderr\": 0.03445406271987054,\n \"\
acc_norm\": 0.23178807947019867,\n \"acc_norm_stderr\": 0.03445406271987054\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.23486238532110093,\n \"acc_stderr\": 0.018175110510343574,\n \"\
acc_norm\": 0.23486238532110093,\n \"acc_norm_stderr\": 0.018175110510343574\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.38425925925925924,\n \"acc_stderr\": 0.03317354514310742,\n \"\
acc_norm\": 0.38425925925925924,\n \"acc_norm_stderr\": 0.03317354514310742\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.28431372549019607,\n \"acc_stderr\": 0.03166009679399812,\n \"\
acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.03166009679399812\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.25316455696202533,\n \"acc_stderr\": 0.02830465794303529,\n \
\ \"acc_norm\": 0.25316455696202533,\n \"acc_norm_stderr\": 0.02830465794303529\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.26905829596412556,\n\
\ \"acc_stderr\": 0.029763779406874965,\n \"acc_norm\": 0.26905829596412556,\n\
\ \"acc_norm_stderr\": 0.029763779406874965\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.25190839694656486,\n \"acc_stderr\": 0.03807387116306086,\n\
\ \"acc_norm\": 0.25190839694656486,\n \"acc_norm_stderr\": 0.03807387116306086\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.24793388429752067,\n \"acc_stderr\": 0.03941897526516303,\n \"\
acc_norm\": 0.24793388429752067,\n \"acc_norm_stderr\": 0.03941897526516303\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2962962962962963,\n\
\ \"acc_stderr\": 0.044143436668549335,\n \"acc_norm\": 0.2962962962962963,\n\
\ \"acc_norm_stderr\": 0.044143436668549335\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.25153374233128833,\n \"acc_stderr\": 0.034089978868575295,\n\
\ \"acc_norm\": 0.25153374233128833,\n \"acc_norm_stderr\": 0.034089978868575295\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n\
\ \"acc_stderr\": 0.0432704093257873,\n \"acc_norm\": 0.29464285714285715,\n\
\ \"acc_norm_stderr\": 0.0432704093257873\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.2524271844660194,\n \"acc_stderr\": 0.04301250399690877,\n\
\ \"acc_norm\": 0.2524271844660194,\n \"acc_norm_stderr\": 0.04301250399690877\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.029343114798094455,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.029343114798094455\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.25798212005108556,\n\
\ \"acc_stderr\": 0.01564583018834895,\n \"acc_norm\": 0.25798212005108556,\n\
\ \"acc_norm_stderr\": 0.01564583018834895\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n\
\ \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27262569832402234,\n\
\ \"acc_stderr\": 0.014893391735249588,\n \"acc_norm\": 0.27262569832402234,\n\
\ \"acc_norm_stderr\": 0.014893391735249588\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.024630048979824775,\n\
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.024630048979824775\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.26366559485530544,\n\
\ \"acc_stderr\": 0.02502553850053234,\n \"acc_norm\": 0.26366559485530544,\n\
\ \"acc_norm_stderr\": 0.02502553850053234\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.26851851851851855,\n \"acc_stderr\": 0.024659685185967277,\n\
\ \"acc_norm\": 0.26851851851851855,\n \"acc_norm_stderr\": 0.024659685185967277\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2553191489361702,\n \"acc_stderr\": 0.02601199293090201,\n \
\ \"acc_norm\": 0.2553191489361702,\n \"acc_norm_stderr\": 0.02601199293090201\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.27183833116036504,\n\
\ \"acc_stderr\": 0.011363135278651414,\n \"acc_norm\": 0.27183833116036504,\n\
\ \"acc_norm_stderr\": 0.011363135278651414\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.029520095697687758,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.029520095697687758\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25980392156862747,\n \"acc_stderr\": 0.01774089950917779,\n \
\ \"acc_norm\": 0.25980392156862747,\n \"acc_norm_stderr\": 0.01774089950917779\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.33636363636363636,\n\
\ \"acc_stderr\": 0.04525393596302505,\n \"acc_norm\": 0.33636363636363636,\n\
\ \"acc_norm_stderr\": 0.04525393596302505\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.025607375986579153,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.025607375986579153\n \
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23880597014925373,\n\
\ \"acc_stderr\": 0.03014777593540922,\n \"acc_norm\": 0.23880597014925373,\n\
\ \"acc_norm_stderr\": 0.03014777593540922\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.040201512610368445,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.040201512610368445\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.22289156626506024,\n\
\ \"acc_stderr\": 0.0324000482559469,\n \"acc_norm\": 0.22289156626506024,\n\
\ \"acc_norm_stderr\": 0.0324000482559469\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.28654970760233917,\n \"acc_stderr\": 0.034678266857038266,\n\
\ \"acc_norm\": 0.28654970760233917,\n \"acc_norm_stderr\": 0.034678266857038266\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2631578947368421,\n\
\ \"mc1_stderr\": 0.015415241740237017,\n \"mc2\": 0.43935436863671357,\n\
\ \"mc2_stderr\": 0.01489259237232499\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5146014206787688,\n \"acc_stderr\": 0.01404649238327584\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.03184230477634572,\n \
\ \"acc_stderr\": 0.0048363485582609035\n }\n}\n```"
repo_url: https://huggingface.co/AIGym/deepseek-coder-1.3b-chat
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_03T15_09_58.075482
path:
- '**/details_harness|arc:challenge|25_2024-02-03T15-09-58.075482.parquet'
- split: 2024_02_03T15_27_05.050992
path:
- '**/details_harness|arc:challenge|25_2024-02-03T15-27-05.050992.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-03T15-27-05.050992.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_03T15_09_58.075482
path:
- '**/details_harness|gsm8k|5_2024-02-03T15-09-58.075482.parquet'
- split: 2024_02_03T15_27_05.050992
path:
- '**/details_harness|gsm8k|5_2024-02-03T15-27-05.050992.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-03T15-27-05.050992.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_03T15_09_58.075482
path:
- '**/details_harness|hellaswag|10_2024-02-03T15-09-58.075482.parquet'
- split: 2024_02_03T15_27_05.050992
path:
- '**/details_harness|hellaswag|10_2024-02-03T15-27-05.050992.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-03T15-27-05.050992.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_03T15_09_58.075482
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T15-09-58.075482.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-03T15-09-58.075482.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-03T15-09-58.075482.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T15-09-58.075482.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T15-09-58.075482.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-03T15-09-58.075482.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T15-09-58.075482.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T15-09-58.075482.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T15-09-58.075482.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T15-09-58.075482.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-03T15-09-58.075482.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-03T15-09-58.075482.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T15-09-58.075482.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-03T15-09-58.075482.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T15-09-58.075482.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T15-09-58.075482.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T15-09-58.075482.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-03T15-09-58.075482.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T15-09-58.075482.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T15-09-58.075482.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T15-09-58.075482.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T15-09-58.075482.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T15-09-58.075482.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T15-09-58.075482.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T15-09-58.075482.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T15-09-58.075482.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T15-09-58.075482.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T15-09-58.075482.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T15-09-58.075482.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T15-09-58.075482.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T15-09-58.075482.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T15-09-58.075482.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-03T15-09-58.075482.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T15-09-58.075482.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-03T15-09-58.075482.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T15-09-58.075482.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T15-09-58.075482.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T15-09-58.075482.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-03T15-09-58.075482.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-03T15-09-58.075482.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T15-09-58.075482.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T15-09-58.075482.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T15-09-58.075482.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T15-09-58.075482.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-03T15-09-58.075482.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-03T15-09-58.075482.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-03T15-09-58.075482.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T15-09-58.075482.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-03T15-09-58.075482.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T15-09-58.075482.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T15-09-58.075482.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-03T15-09-58.075482.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-03T15-09-58.075482.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-03T15-09-58.075482.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T15-09-58.075482.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-03T15-09-58.075482.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-03T15-09-58.075482.parquet'
- split: 2024_02_03T15_27_05.050992
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-03T15-27-05.050992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-03T15-27-05.050992.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-03T15-27-05.050992.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_03T15_09_58.075482
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T15-09-58.075482.parquet'
- split: 2024_02_03T15_27_05.050992
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T15-27-05.050992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T15-27-05.050992.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_03T15_09_58.075482
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-03T15-09-58.075482.parquet'
- split: 2024_02_03T15_27_05.050992
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-03T15-27-05.050992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-03T15-27-05.050992.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_03T15_09_58.075482
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-03T15-09-58.075482.parquet'
- split: 2024_02_03T15_27_05.050992
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-03T15-27-05.050992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-03T15-27-05.050992.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_03T15_09_58.075482
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T15-09-58.075482.parquet'
- split: 2024_02_03T15_27_05.050992
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T15-27-05.050992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T15-27-05.050992.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_03T15_09_58.075482
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T15-09-58.075482.parquet'
- split: 2024_02_03T15_27_05.050992
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T15-27-05.050992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T15-27-05.050992.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_03T15_09_58.075482
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-03T15-09-58.075482.parquet'
- split: 2024_02_03T15_27_05.050992
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-03T15-27-05.050992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-03T15-27-05.050992.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_03T15_09_58.075482
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T15-09-58.075482.parquet'
- split: 2024_02_03T15_27_05.050992
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T15-27-05.050992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T15-27-05.050992.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_03T15_09_58.075482
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T15-09-58.075482.parquet'
- split: 2024_02_03T15_27_05.050992
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T15-27-05.050992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T15-27-05.050992.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_03T15_09_58.075482
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T15-09-58.075482.parquet'
- split: 2024_02_03T15_27_05.050992
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T15-27-05.050992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T15-27-05.050992.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_03T15_09_58.075482
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T15-09-58.075482.parquet'
- split: 2024_02_03T15_27_05.050992
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T15-27-05.050992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T15-27-05.050992.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_03T15_09_58.075482
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-03T15-09-58.075482.parquet'
- split: 2024_02_03T15_27_05.050992
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-03T15-27-05.050992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-03T15-27-05.050992.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_03T15_09_58.075482
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-03T15-09-58.075482.parquet'
- split: 2024_02_03T15_27_05.050992
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-03T15-27-05.050992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-03T15-27-05.050992.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_03T15_09_58.075482
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T15-09-58.075482.parquet'
- split: 2024_02_03T15_27_05.050992
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T15-27-05.050992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T15-27-05.050992.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_03T15_09_58.075482
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-03T15-09-58.075482.parquet'
- split: 2024_02_03T15_27_05.050992
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-03T15-27-05.050992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-03T15-27-05.050992.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_03T15_09_58.075482
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T15-09-58.075482.parquet'
- split: 2024_02_03T15_27_05.050992
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T15-27-05.050992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T15-27-05.050992.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_03T15_09_58.075482
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T15-09-58.075482.parquet'
- split: 2024_02_03T15_27_05.050992
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T15-27-05.050992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T15-27-05.050992.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_03T15_09_58.075482
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T15-09-58.075482.parquet'
- split: 2024_02_03T15_27_05.050992
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T15-27-05.050992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T15-27-05.050992.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_03T15_09_58.075482
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-03T15-09-58.075482.parquet'
- split: 2024_02_03T15_27_05.050992
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-03T15-27-05.050992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-03T15-27-05.050992.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_03T15_09_58.075482
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T15-09-58.075482.parquet'
- split: 2024_02_03T15_27_05.050992
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T15-27-05.050992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T15-27-05.050992.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_03T15_09_58.075482
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T15-09-58.075482.parquet'
- split: 2024_02_03T15_27_05.050992
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T15-27-05.050992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T15-27-05.050992.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_03T15_09_58.075482
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T15-09-58.075482.parquet'
- split: 2024_02_03T15_27_05.050992
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T15-27-05.050992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T15-27-05.050992.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_03T15_09_58.075482
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T15-09-58.075482.parquet'
- split: 2024_02_03T15_27_05.050992
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T15-27-05.050992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T15-27-05.050992.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_03T15_09_58.075482
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T15-09-58.075482.parquet'
- split: 2024_02_03T15_27_05.050992
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T15-27-05.050992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T15-27-05.050992.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_03T15_09_58.075482
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T15-09-58.075482.parquet'
- split: 2024_02_03T15_27_05.050992
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T15-27-05.050992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T15-27-05.050992.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_03T15_09_58.075482
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T15-09-58.075482.parquet'
- split: 2024_02_03T15_27_05.050992
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T15-27-05.050992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T15-27-05.050992.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_03T15_09_58.075482
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T15-09-58.075482.parquet'
- split: 2024_02_03T15_27_05.050992
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T15-27-05.050992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T15-27-05.050992.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_03T15_09_58.075482
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T15-09-58.075482.parquet'
- split: 2024_02_03T15_27_05.050992
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T15-27-05.050992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T15-27-05.050992.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_03T15_09_58.075482
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T15-09-58.075482.parquet'
- split: 2024_02_03T15_27_05.050992
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T15-27-05.050992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T15-27-05.050992.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_03T15_09_58.075482
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T15-09-58.075482.parquet'
- split: 2024_02_03T15_27_05.050992
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T15-27-05.050992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T15-27-05.050992.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_03T15_09_58.075482
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T15-09-58.075482.parquet'
- split: 2024_02_03T15_27_05.050992
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T15-27-05.050992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T15-27-05.050992.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_03T15_09_58.075482
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T15-09-58.075482.parquet'
- split: 2024_02_03T15_27_05.050992
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T15-27-05.050992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T15-27-05.050992.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_03T15_09_58.075482
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T15-09-58.075482.parquet'
- split: 2024_02_03T15_27_05.050992
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T15-27-05.050992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T15-27-05.050992.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_03T15_09_58.075482
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-03T15-09-58.075482.parquet'
- split: 2024_02_03T15_27_05.050992
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-03T15-27-05.050992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-03T15-27-05.050992.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_03T15_09_58.075482
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T15-09-58.075482.parquet'
- split: 2024_02_03T15_27_05.050992
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T15-27-05.050992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T15-27-05.050992.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_03T15_09_58.075482
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-03T15-09-58.075482.parquet'
- split: 2024_02_03T15_27_05.050992
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-03T15-27-05.050992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-03T15-27-05.050992.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_03T15_09_58.075482
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T15-09-58.075482.parquet'
- split: 2024_02_03T15_27_05.050992
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T15-27-05.050992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T15-27-05.050992.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_03T15_09_58.075482
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T15-09-58.075482.parquet'
- split: 2024_02_03T15_27_05.050992
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T15-27-05.050992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T15-27-05.050992.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_03T15_09_58.075482
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T15-09-58.075482.parquet'
- split: 2024_02_03T15_27_05.050992
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T15-27-05.050992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T15-27-05.050992.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_03T15_09_58.075482
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-03T15-09-58.075482.parquet'
- split: 2024_02_03T15_27_05.050992
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-03T15-27-05.050992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-03T15-27-05.050992.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_03T15_09_58.075482
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-03T15-09-58.075482.parquet'
- split: 2024_02_03T15_27_05.050992
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-03T15-27-05.050992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-03T15-27-05.050992.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_03T15_09_58.075482
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T15-09-58.075482.parquet'
- split: 2024_02_03T15_27_05.050992
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T15-27-05.050992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T15-27-05.050992.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_03T15_09_58.075482
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T15-09-58.075482.parquet'
- split: 2024_02_03T15_27_05.050992
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T15-27-05.050992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T15-27-05.050992.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_03T15_09_58.075482
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T15-09-58.075482.parquet'
- split: 2024_02_03T15_27_05.050992
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T15-27-05.050992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T15-27-05.050992.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_03T15_09_58.075482
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T15-09-58.075482.parquet'
- split: 2024_02_03T15_27_05.050992
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T15-27-05.050992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T15-27-05.050992.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_03T15_09_58.075482
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-03T15-09-58.075482.parquet'
- split: 2024_02_03T15_27_05.050992
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-03T15-27-05.050992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-03T15-27-05.050992.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_03T15_09_58.075482
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-03T15-09-58.075482.parquet'
- split: 2024_02_03T15_27_05.050992
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-03T15-27-05.050992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-03T15-27-05.050992.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_03T15_09_58.075482
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-03T15-09-58.075482.parquet'
- split: 2024_02_03T15_27_05.050992
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-03T15-27-05.050992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-03T15-27-05.050992.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_03T15_09_58.075482
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T15-09-58.075482.parquet'
- split: 2024_02_03T15_27_05.050992
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T15-27-05.050992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T15-27-05.050992.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_03T15_09_58.075482
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-03T15-09-58.075482.parquet'
- split: 2024_02_03T15_27_05.050992
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-03T15-27-05.050992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-03T15-27-05.050992.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_03T15_09_58.075482
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T15-09-58.075482.parquet'
- split: 2024_02_03T15_27_05.050992
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T15-27-05.050992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T15-27-05.050992.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_03T15_09_58.075482
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T15-09-58.075482.parquet'
- split: 2024_02_03T15_27_05.050992
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T15-27-05.050992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T15-27-05.050992.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_03T15_09_58.075482
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-03T15-09-58.075482.parquet'
- split: 2024_02_03T15_27_05.050992
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-03T15-27-05.050992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-03T15-27-05.050992.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_03T15_09_58.075482
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-03T15-09-58.075482.parquet'
- split: 2024_02_03T15_27_05.050992
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-03T15-27-05.050992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-03T15-27-05.050992.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_03T15_09_58.075482
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-03T15-09-58.075482.parquet'
- split: 2024_02_03T15_27_05.050992
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-03T15-27-05.050992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-03T15-27-05.050992.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_03T15_09_58.075482
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T15-09-58.075482.parquet'
- split: 2024_02_03T15_27_05.050992
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T15-27-05.050992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T15-27-05.050992.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_03T15_09_58.075482
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-03T15-09-58.075482.parquet'
- split: 2024_02_03T15_27_05.050992
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-03T15-27-05.050992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-03T15-27-05.050992.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_03T15_09_58.075482
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-03T15-09-58.075482.parquet'
- split: 2024_02_03T15_27_05.050992
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-03T15-27-05.050992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-03T15-27-05.050992.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_03T15_09_58.075482
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-03T15-09-58.075482.parquet'
- split: 2024_02_03T15_27_05.050992
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-03T15-27-05.050992.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-03T15-27-05.050992.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_03T15_09_58.075482
path:
- '**/details_harness|winogrande|5_2024-02-03T15-09-58.075482.parquet'
- split: 2024_02_03T15_27_05.050992
path:
- '**/details_harness|winogrande|5_2024-02-03T15-27-05.050992.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-03T15-27-05.050992.parquet'
- config_name: results
data_files:
- split: 2024_02_03T15_09_58.075482
path:
- results_2024-02-03T15-09-58.075482.parquet
- split: 2024_02_03T15_27_05.050992
path:
- results_2024-02-03T15-27-05.050992.parquet
- split: latest
path:
- results_2024-02-03T15-27-05.050992.parquet
---
# Dataset Card for Evaluation run of AIGym/deepseek-coder-1.3b-chat
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [AIGym/deepseek-coder-1.3b-chat](https://huggingface.co/AIGym/deepseek-coder-1.3b-chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_AIGym__deepseek-coder-1.3b-chat",
"harness_winogrande_5",
split="train")
```
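Beyond the `latest` alias, each run is also exposed as a split named after its timestamp (e.g. `2024_02_03T15_27_05.050992`). Because those names are zero-padded and fixed-width, they sort lexicographically in chronological order, so the most recent run can also be picked programmatically — a minimal sketch, with the two split names from this card hard-coded for illustration:

```python
# Timestamped split names sort lexicographically in chronological order,
# so max() selects the most recent run (the one the "latest" split aliases).
splits = ["2024_02_03T15_09_58.075482", "2024_02_03T15_27_05.050992"]
latest = max(splits)
print(latest)  # 2024_02_03T15_27_05.050992
```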
## Latest results
These are the [latest results from run 2024-02-03T15:27:05.050992](https://huggingface.co/datasets/open-llm-leaderboard/details_AIGym__deepseek-coder-1.3b-chat/blob/main/results_2024-02-03T15-27-05.050992.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.256701988196126,
"acc_stderr": 0.030917309603694414,
"acc_norm": 0.2577806846513683,
"acc_norm_stderr": 0.031657483876574605,
"mc1": 0.2631578947368421,
"mc1_stderr": 0.015415241740237017,
"mc2": 0.43935436863671357,
"mc2_stderr": 0.01489259237232499
},
"harness|arc:challenge|25": {
"acc": 0.22610921501706485,
"acc_stderr": 0.012224202097063286,
"acc_norm": 0.25597269624573377,
"acc_norm_stderr": 0.012753013241244516
},
"harness|hellaswag|10": {
"acc": 0.3301135232025493,
"acc_stderr": 0.004692926794268453,
"acc_norm": 0.39693288189603665,
"acc_norm_stderr": 0.004882619484166603
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165044,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165044
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.039992628766177214,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.039992628766177214
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.15789473684210525,
"acc_stderr": 0.02967416752010146,
"acc_norm": 0.15789473684210525,
"acc_norm_stderr": 0.02967416752010146
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.28679245283018867,
"acc_stderr": 0.027834912527544074,
"acc_norm": 0.28679245283018867,
"acc_norm_stderr": 0.027834912527544074
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2708333333333333,
"acc_stderr": 0.037161774375660164,
"acc_norm": 0.2708333333333333,
"acc_norm_stderr": 0.037161774375660164
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.1907514450867052,
"acc_stderr": 0.029957851329869334,
"acc_norm": 0.1907514450867052,
"acc_norm_stderr": 0.029957851329869334
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.041583075330832865,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.041583075330832865
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.28936170212765955,
"acc_stderr": 0.029644006577009618,
"acc_norm": 0.28936170212765955,
"acc_norm_stderr": 0.029644006577009618
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.03999423879281337,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.03999423879281337
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.3103448275862069,
"acc_stderr": 0.038552896163789485,
"acc_norm": 0.3103448275862069,
"acc_norm_stderr": 0.038552896163789485
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.02256989707491841,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.02256989707491841
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.21428571428571427,
"acc_stderr": 0.03670066451047182,
"acc_norm": 0.21428571428571427,
"acc_norm_stderr": 0.03670066451047182
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2161290322580645,
"acc_stderr": 0.02341529343356853,
"acc_norm": 0.2161290322580645,
"acc_norm_stderr": 0.02341529343356853
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.23645320197044334,
"acc_stderr": 0.029896114291733552,
"acc_norm": 0.23645320197044334,
"acc_norm_stderr": 0.029896114291733552
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.24848484848484848,
"acc_stderr": 0.03374402644139404,
"acc_norm": 0.24848484848484848,
"acc_norm_stderr": 0.03374402644139404
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.20202020202020202,
"acc_stderr": 0.028606204289229886,
"acc_norm": 0.20202020202020202,
"acc_norm_stderr": 0.028606204289229886
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.22279792746113988,
"acc_stderr": 0.03003114797764154,
"acc_norm": 0.22279792746113988,
"acc_norm_stderr": 0.03003114797764154
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2128205128205128,
"acc_stderr": 0.02075242372212802,
"acc_norm": 0.2128205128205128,
"acc_norm_stderr": 0.02075242372212802
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.026962424325073817,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.026962424325073817
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.19747899159663865,
"acc_stderr": 0.025859164122051463,
"acc_norm": 0.19747899159663865,
"acc_norm_stderr": 0.025859164122051463
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.23178807947019867,
"acc_stderr": 0.03445406271987054,
"acc_norm": 0.23178807947019867,
"acc_norm_stderr": 0.03445406271987054
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.23486238532110093,
"acc_stderr": 0.018175110510343574,
"acc_norm": 0.23486238532110093,
"acc_norm_stderr": 0.018175110510343574
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.38425925925925924,
"acc_stderr": 0.03317354514310742,
"acc_norm": 0.38425925925925924,
"acc_norm_stderr": 0.03317354514310742
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.03166009679399812,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.03166009679399812
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.25316455696202533,
"acc_stderr": 0.02830465794303529,
"acc_norm": 0.25316455696202533,
"acc_norm_stderr": 0.02830465794303529
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.26905829596412556,
"acc_stderr": 0.029763779406874965,
"acc_norm": 0.26905829596412556,
"acc_norm_stderr": 0.029763779406874965
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.25190839694656486,
"acc_stderr": 0.03807387116306086,
"acc_norm": 0.25190839694656486,
"acc_norm_stderr": 0.03807387116306086
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.24793388429752067,
"acc_stderr": 0.03941897526516303,
"acc_norm": 0.24793388429752067,
"acc_norm_stderr": 0.03941897526516303
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.044143436668549335,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.044143436668549335
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.25153374233128833,
"acc_stderr": 0.034089978868575295,
"acc_norm": 0.25153374233128833,
"acc_norm_stderr": 0.034089978868575295
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.29464285714285715,
"acc_stderr": 0.0432704093257873,
"acc_norm": 0.29464285714285715,
"acc_norm_stderr": 0.0432704093257873
},
"harness|hendrycksTest-management|5": {
"acc": 0.2524271844660194,
"acc_stderr": 0.04301250399690877,
"acc_norm": 0.2524271844660194,
"acc_norm_stderr": 0.04301250399690877
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.029343114798094455,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.029343114798094455
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.25798212005108556,
"acc_stderr": 0.01564583018834895,
"acc_norm": 0.25798212005108556,
"acc_norm_stderr": 0.01564583018834895
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.27262569832402234,
"acc_stderr": 0.014893391735249588,
"acc_norm": 0.27262569832402234,
"acc_norm_stderr": 0.014893391735249588
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.024630048979824775,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.024630048979824775
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.26366559485530544,
"acc_stderr": 0.02502553850053234,
"acc_norm": 0.26366559485530544,
"acc_norm_stderr": 0.02502553850053234
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.26851851851851855,
"acc_stderr": 0.024659685185967277,
"acc_norm": 0.26851851851851855,
"acc_norm_stderr": 0.024659685185967277
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2553191489361702,
"acc_stderr": 0.02601199293090201,
"acc_norm": 0.2553191489361702,
"acc_norm_stderr": 0.02601199293090201
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.27183833116036504,
"acc_stderr": 0.011363135278651414,
"acc_norm": 0.27183833116036504,
"acc_norm_stderr": 0.011363135278651414
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.029520095697687758,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.029520095697687758
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25980392156862747,
"acc_stderr": 0.01774089950917779,
"acc_norm": 0.25980392156862747,
"acc_norm_stderr": 0.01774089950917779
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.33636363636363636,
"acc_stderr": 0.04525393596302505,
"acc_norm": 0.33636363636363636,
"acc_norm_stderr": 0.04525393596302505
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2,
"acc_stderr": 0.025607375986579153,
"acc_norm": 0.2,
"acc_norm_stderr": 0.025607375986579153
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23880597014925373,
"acc_stderr": 0.03014777593540922,
"acc_norm": 0.23880597014925373,
"acc_norm_stderr": 0.03014777593540922
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.2,
"acc_stderr": 0.040201512610368445,
"acc_norm": 0.2,
"acc_norm_stderr": 0.040201512610368445
},
"harness|hendrycksTest-virology|5": {
"acc": 0.22289156626506024,
"acc_stderr": 0.0324000482559469,
"acc_norm": 0.22289156626506024,
"acc_norm_stderr": 0.0324000482559469
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.28654970760233917,
"acc_stderr": 0.034678266857038266,
"acc_norm": 0.28654970760233917,
"acc_norm_stderr": 0.034678266857038266
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2631578947368421,
"mc1_stderr": 0.015415241740237017,
"mc2": 0.43935436863671357,
"mc2_stderr": 0.01489259237232499
},
"harness|winogrande|5": {
"acc": 0.5146014206787688,
"acc_stderr": 0.01404649238327584
},
"harness|gsm8k|5": {
"acc": 0.03184230477634572,
"acc_stderr": 0.0048363485582609035
}
}
```
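The results file shown above is plain JSON, so once downloaded the aggregate metrics can be pulled out with the standard library alone — a minimal sketch, using a truncated stand-in for the full file:

```python
import json

# Truncated stand-in for results_2024-02-03T15-27-05.050992.json;
# the real file has the same structure with all tasks present.
raw = '''
{
  "all": {"acc": 0.256701988196126, "acc_norm": 0.2577806846513683},
  "harness|winogrande|5": {"acc": 0.5146014206787688},
  "harness|gsm8k|5": {"acc": 0.03184230477634572}
}
'''
results = json.loads(raw)

print(round(results["all"]["acc"], 4))         # 0.2567
print(results["harness|winogrande|5"]["acc"])  # 0.5146014206787688
```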
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
huggingartists/lil-uzi-vert | ---
language:
- en
tags:
- huggingartists
- lyrics
---
# Dataset Card for "huggingartists/lil-uzi-vert"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [How to use](#how-to-use)
- [Dataset Structure](#dataset-structure)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [About](#about)
## Dataset Description
- **Homepage:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Repository:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of the generated dataset:** 1.837334 MB
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://images.genius.com/73f52f6c73859a68ab961ca797e7b848.725x725x1.jpg')">
</div>
</div>
<a href="https://huggingface.co/huggingartists/lil-uzi-vert">
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
</a>
<div style="text-align: center; font-size: 16px; font-weight: 800">Lil Uzi Vert</div>
<a href="https://genius.com/artists/lil-uzi-vert">
<div style="text-align: center; font-size: 14px;">@lil-uzi-vert</div>
</a>
</div>
### Dataset Summary
The Lyrics dataset parsed from Genius. This dataset is designed to generate lyrics with HuggingArtists.
Model is available [here](https://huggingface.co/huggingartists/lil-uzi-vert).
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
en
## How to use
How to load this dataset directly with the datasets library:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/lil-uzi-vert")
```
## Dataset Structure
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"text": "Look, I was gonna go easy on you\nNot to hurt your feelings\nBut I'm only going to get this one chance\nSomething's wrong, I can feel it..."
}
```
### Data Fields
The data fields are the same among all splits.
- `text`: a `string` feature.
### Data Splits
| train | validation | test |
|------:|-----------:|-----:|
|   845 |          - |    - |
The 'train' split can easily be divided into 'train', 'validation', and 'test' with a few lines of code:
```python
from datasets import load_dataset, Dataset, DatasetDict
import numpy as np
datasets = load_dataset("huggingartists/lil-uzi-vert")
train_percentage = 0.9
validation_percentage = 0.07
test_percentage = 0.03
train, validation, test = np.split(
    datasets['train']['text'],
    [
        int(len(datasets['train']['text']) * train_percentage),
        int(len(datasets['train']['text']) * (train_percentage + validation_percentage)),
    ],
)
datasets = DatasetDict(
{
'train': Dataset.from_dict({'text': list(train)}),
'validation': Dataset.from_dict({'text': list(validation)}),
'test': Dataset.from_dict({'text': list(test)})
}
)
```
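The `np.split` call above cuts at the cumulative 90% and 97% index boundaries. A standalone sketch of the same arithmetic, using a synthetic list in place of the downloaded split (the real `train` has 845 rows):

```python
# Pure-Python equivalent of the np.split boundaries used above,
# on a synthetic stand-in for datasets['train']['text'].
texts = [f"song {i}" for i in range(845)]

cut1 = int(len(texts) * 0.9)   # 760 -> end of train
cut2 = int(len(texts) * 0.97)  # 819 -> end of validation

train, validation, test = texts[:cut1], texts[cut1:cut2], texts[cut2:]
print(len(train), len(validation), len(test))  # 760 59 26
```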
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@InProceedings{huggingartists,
author={Aleksey Korshuk},
year=2021
}
```
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
DonGenialo/pixel_images_354 | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 32413851.0
num_examples: 354
download_size: 29491789
dataset_size: 32413851.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_ChuckMcSneed__WinterGoddess-1.4x-70b-32k | ---
pretty_name: Evaluation run of ChuckMcSneed/WinterGoddess-1.4x-70b-32k
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ChuckMcSneed/WinterGoddess-1.4x-70b-32k](https://huggingface.co/ChuckMcSneed/WinterGoddess-1.4x-70b-32k)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ChuckMcSneed__WinterGoddess-1.4x-70b-32k\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-02T22:05:42.684950](https://huggingface.co/datasets/open-llm-leaderboard/details_ChuckMcSneed__WinterGoddess-1.4x-70b-32k/blob/main/results_2024-02-02T22-05-42.684950.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6639765808635061,\n\
\ \"acc_stderr\": 0.03149901214232908,\n \"acc_norm\": 0.6688177091509891,\n\
\ \"acc_norm_stderr\": 0.03212294826804706,\n \"mc1\": 0.4700122399020808,\n\
\ \"mc1_stderr\": 0.01747199209169754,\n \"mc2\": 0.6387436130479125,\n\
\ \"mc2_stderr\": 0.014303842525660086\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6732081911262798,\n \"acc_stderr\": 0.013706665975587338,\n\
\ \"acc_norm\": 0.71160409556314,\n \"acc_norm_stderr\": 0.013238394422428171\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7134037044413464,\n\
\ \"acc_stderr\": 0.004512471612415591,\n \"acc_norm\": 0.8911571400119498,\n\
\ \"acc_norm_stderr\": 0.0031080545633521087\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7828947368421053,\n \"acc_stderr\": 0.03355045304882924,\n\
\ \"acc_norm\": 0.7828947368421053,\n \"acc_norm_stderr\": 0.03355045304882924\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.71,\n\
\ \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443866,\n\
\ \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443866\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.03309615177059006,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.03309615177059006\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"\
acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n\
\ \"acc_stderr\": 0.0368122963339432,\n \"acc_norm\": 0.630057803468208,\n\
\ \"acc_norm_stderr\": 0.0368122963339432\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.047551296160629475,\n\
\ \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.047551296160629475\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6127659574468085,\n \"acc_stderr\": 0.03184389265339525,\n\
\ \"acc_norm\": 0.6127659574468085,\n \"acc_norm_stderr\": 0.03184389265339525\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.04692008381368909,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.04692008381368909\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n\
\ \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.40476190476190477,\n \"acc_stderr\": 0.025279850397404904,\n \"\
acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.025279850397404904\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.38095238095238093,\n\
\ \"acc_stderr\": 0.04343525428949098,\n \"acc_norm\": 0.38095238095238093,\n\
\ \"acc_norm_stderr\": 0.04343525428949098\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7354838709677419,\n\
\ \"acc_stderr\": 0.02509189237885928,\n \"acc_norm\": 0.7354838709677419,\n\
\ \"acc_norm_stderr\": 0.02509189237885928\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5369458128078818,\n \"acc_stderr\": 0.035083705204426656,\n\
\ \"acc_norm\": 0.5369458128078818,\n \"acc_norm_stderr\": 0.035083705204426656\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\"\
: 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8484848484848485,\n \"acc_stderr\": 0.027998073798781668,\n\
\ \"acc_norm\": 0.8484848484848485,\n \"acc_norm_stderr\": 0.027998073798781668\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8636363636363636,\n \"acc_stderr\": 0.024450155973189835,\n \"\
acc_norm\": 0.8636363636363636,\n \"acc_norm_stderr\": 0.024450155973189835\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.917098445595855,\n \"acc_stderr\": 0.01989934131572178,\n\
\ \"acc_norm\": 0.917098445595855,\n \"acc_norm_stderr\": 0.01989934131572178\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6461538461538462,\n \"acc_stderr\": 0.024243783994062164,\n\
\ \"acc_norm\": 0.6461538461538462,\n \"acc_norm_stderr\": 0.024243783994062164\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2777777777777778,\n \"acc_stderr\": 0.02730914058823018,\n \
\ \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.02730914058823018\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6932773109243697,\n \"acc_stderr\": 0.029953823891887058,\n\
\ \"acc_norm\": 0.6932773109243697,\n \"acc_norm_stderr\": 0.029953823891887058\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.45695364238410596,\n \"acc_stderr\": 0.04067325174247443,\n \"\
acc_norm\": 0.45695364238410596,\n \"acc_norm_stderr\": 0.04067325174247443\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8788990825688073,\n \"acc_stderr\": 0.013987618292389713,\n \"\
acc_norm\": 0.8788990825688073,\n \"acc_norm_stderr\": 0.013987618292389713\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5416666666666666,\n \"acc_stderr\": 0.03398110890294636,\n \"\
acc_norm\": 0.5416666666666666,\n \"acc_norm_stderr\": 0.03398110890294636\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8872549019607843,\n \"acc_stderr\": 0.02219857103945679,\n \"\
acc_norm\": 0.8872549019607843,\n \"acc_norm_stderr\": 0.02219857103945679\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8649789029535865,\n \"acc_stderr\": 0.022245776632003694,\n \
\ \"acc_norm\": 0.8649789029535865,\n \"acc_norm_stderr\": 0.022245776632003694\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7892376681614349,\n\
\ \"acc_stderr\": 0.027373095500540193,\n \"acc_norm\": 0.7892376681614349,\n\
\ \"acc_norm_stderr\": 0.027373095500540193\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159462,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159462\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8429752066115702,\n \"acc_stderr\": 0.03321244842547128,\n \"\
acc_norm\": 0.8429752066115702,\n \"acc_norm_stderr\": 0.03321244842547128\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\
\ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\
\ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742179,\n\
\ \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742179\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5892857142857143,\n\
\ \"acc_stderr\": 0.04669510663875191,\n \"acc_norm\": 0.5892857142857143,\n\
\ \"acc_norm_stderr\": 0.04669510663875191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8446601941747572,\n \"acc_stderr\": 0.03586594738573974,\n\
\ \"acc_norm\": 0.8446601941747572,\n \"acc_norm_stderr\": 0.03586594738573974\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8974358974358975,\n\
\ \"acc_stderr\": 0.019875655027867443,\n \"acc_norm\": 0.8974358974358975,\n\
\ \"acc_norm_stderr\": 0.019875655027867443\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8301404853128991,\n\
\ \"acc_stderr\": 0.013428186370608303,\n \"acc_norm\": 0.8301404853128991,\n\
\ \"acc_norm_stderr\": 0.013428186370608303\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7601156069364162,\n \"acc_stderr\": 0.022989592543123567,\n\
\ \"acc_norm\": 0.7601156069364162,\n \"acc_norm_stderr\": 0.022989592543123567\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.33631284916201115,\n\
\ \"acc_stderr\": 0.01580100372914589,\n \"acc_norm\": 0.33631284916201115,\n\
\ \"acc_norm_stderr\": 0.01580100372914589\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.696078431372549,\n \"acc_stderr\": 0.026336613469046633,\n\
\ \"acc_norm\": 0.696078431372549,\n \"acc_norm_stderr\": 0.026336613469046633\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7459807073954984,\n\
\ \"acc_stderr\": 0.024723861504771693,\n \"acc_norm\": 0.7459807073954984,\n\
\ \"acc_norm_stderr\": 0.024723861504771693\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7746913580246914,\n \"acc_stderr\": 0.02324620264781975,\n\
\ \"acc_norm\": 0.7746913580246914,\n \"acc_norm_stderr\": 0.02324620264781975\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5567375886524822,\n \"acc_stderr\": 0.02963483847376601,\n \
\ \"acc_norm\": 0.5567375886524822,\n \"acc_norm_stderr\": 0.02963483847376601\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5625814863102999,\n\
\ \"acc_stderr\": 0.01266981346493572,\n \"acc_norm\": 0.5625814863102999,\n\
\ \"acc_norm_stderr\": 0.01266981346493572\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6727941176470589,\n \"acc_stderr\": 0.02850145286039656,\n\
\ \"acc_norm\": 0.6727941176470589,\n \"acc_norm_stderr\": 0.02850145286039656\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7401960784313726,\n \"acc_stderr\": 0.017740899509177795,\n \
\ \"acc_norm\": 0.7401960784313726,\n \"acc_norm_stderr\": 0.017740899509177795\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.04494290866252091,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.04494290866252091\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7673469387755102,\n \"acc_stderr\": 0.02704925791589618,\n\
\ \"acc_norm\": 0.7673469387755102,\n \"acc_norm_stderr\": 0.02704925791589618\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8059701492537313,\n\
\ \"acc_stderr\": 0.027962677604768914,\n \"acc_norm\": 0.8059701492537313,\n\
\ \"acc_norm_stderr\": 0.027962677604768914\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036847,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036847\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5060240963855421,\n\
\ \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.5060240963855421,\n\
\ \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.030611116557432528,\n\
\ \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.030611116557432528\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4700122399020808,\n\
\ \"mc1_stderr\": 0.01747199209169754,\n \"mc2\": 0.6387436130479125,\n\
\ \"mc2_stderr\": 0.014303842525660086\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8255722178374112,\n \"acc_stderr\": 0.010665187902498438\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.43290371493555724,\n \
\ \"acc_stderr\": 0.013647916362576056\n }\n}\n```"
repo_url: https://huggingface.co/Inv/Konstanta-V3-BetaFlavour-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_02T22_05_42.684950
path:
- '**/details_harness|arc:challenge|25_2024-02-02T22-05-42.684950.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-02T22-05-42.684950.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_02T22_05_42.684950
path:
- '**/details_harness|gsm8k|5_2024-02-02T22-05-42.684950.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-02T22-05-42.684950.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_02T22_05_42.684950
path:
- '**/details_harness|hellaswag|10_2024-02-02T22-05-42.684950.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-02T22-05-42.684950.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_02T22_05_42.684950
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T22-05-42.684950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T22-05-42.684950.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T22-05-42.684950.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_02T22_05_42.684950
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T22-05-42.684950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T22-05-42.684950.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_02T22_05_42.684950
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T22-05-42.684950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T22-05-42.684950.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_02T22_05_42.684950
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T22-05-42.684950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T22-05-42.684950.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_02T22_05_42.684950
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T22-05-42.684950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T22-05-42.684950.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_02T22_05_42.684950
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T22-05-42.684950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T22-05-42.684950.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_02T22_05_42.684950
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T22-05-42.684950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T22-05-42.684950.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_02T22_05_42.684950
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T22-05-42.684950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T22-05-42.684950.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_02T22_05_42.684950
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T22-05-42.684950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T22-05-42.684950.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_02T22_05_42.684950
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T22-05-42.684950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T22-05-42.684950.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_02T22_05_42.684950
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T22-05-42.684950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T22-05-42.684950.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_02T22_05_42.684950
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T22-05-42.684950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T22-05-42.684950.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_02T22_05_42.684950
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T22-05-42.684950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T22-05-42.684950.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_02T22_05_42.684950
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T22-05-42.684950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T22-05-42.684950.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_02T22_05_42.684950
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T22-05-42.684950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T22-05-42.684950.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_02T22_05_42.684950
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T22-05-42.684950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T22-05-42.684950.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_02T22_05_42.684950
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T22-05-42.684950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T22-05-42.684950.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_02T22_05_42.684950
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T22-05-42.684950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T22-05-42.684950.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_02T22_05_42.684950
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T22-05-42.684950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T22-05-42.684950.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_02T22_05_42.684950
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T22-05-42.684950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T22-05-42.684950.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_02T22_05_42.684950
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T22-05-42.684950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T22-05-42.684950.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_02T22_05_42.684950
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T22-05-42.684950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T22-05-42.684950.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_02T22_05_42.684950
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T22-05-42.684950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T22-05-42.684950.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_02T22_05_42.684950
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T22-05-42.684950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T22-05-42.684950.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_02T22_05_42.684950
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T22-05-42.684950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T22-05-42.684950.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_02T22_05_42.684950
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T22-05-42.684950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T22-05-42.684950.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_02T22_05_42.684950
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T22-05-42.684950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T22-05-42.684950.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_02T22_05_42.684950
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T22-05-42.684950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T22-05-42.684950.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_02T22_05_42.684950
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T22-05-42.684950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T22-05-42.684950.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_02T22_05_42.684950
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T22-05-42.684950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T22-05-42.684950.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_02T22_05_42.684950
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T22-05-42.684950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T22-05-42.684950.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_02T22_05_42.684950
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T22-05-42.684950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T22-05-42.684950.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_02T22_05_42.684950
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T22-05-42.684950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T22-05-42.684950.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_02T22_05_42.684950
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T22-05-42.684950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T22-05-42.684950.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_02T22_05_42.684950
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T22-05-42.684950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T22-05-42.684950.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_02T22_05_42.684950
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T22-05-42.684950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T22-05-42.684950.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_02T22_05_42.684950
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T22-05-42.684950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T22-05-42.684950.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_02T22_05_42.684950
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T22-05-42.684950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T22-05-42.684950.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_02T22_05_42.684950
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T22-05-42.684950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T22-05-42.684950.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_02T22_05_42.684950
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-02T22-05-42.684950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-02T22-05-42.684950.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_02T22_05_42.684950
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T22-05-42.684950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T22-05-42.684950.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_02T22_05_42.684950
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T22-05-42.684950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T22-05-42.684950.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_02T22_05_42.684950
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T22-05-42.684950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T22-05-42.684950.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_02T22_05_42.684950
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T22-05-42.684950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T22-05-42.684950.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_02T22_05_42.684950
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T22-05-42.684950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T22-05-42.684950.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_02T22_05_42.684950
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T22-05-42.684950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T22-05-42.684950.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_02T22_05_42.684950
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T22-05-42.684950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T22-05-42.684950.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_02T22_05_42.684950
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T22-05-42.684950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T22-05-42.684950.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_02T22_05_42.684950
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T22-05-42.684950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T22-05-42.684950.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_02T22_05_42.684950
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T22-05-42.684950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T22-05-42.684950.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_02T22_05_42.684950
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T22-05-42.684950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T22-05-42.684950.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_02T22_05_42.684950
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T22-05-42.684950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T22-05-42.684950.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_02T22_05_42.684950
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T22-05-42.684950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T22-05-42.684950.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_02T22_05_42.684950
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T22-05-42.684950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T22-05-42.684950.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_02T22_05_42.684950
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T22-05-42.684950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T22-05-42.684950.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_02T22_05_42.684950
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T22-05-42.684950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T22-05-42.684950.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_02T22_05_42.684950
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T22-05-42.684950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T22-05-42.684950.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_02T22_05_42.684950
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T22-05-42.684950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T22-05-42.684950.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_02T22_05_42.684950
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-02T22-05-42.684950.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-02T22-05-42.684950.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_02T22_05_42.684950
path:
- '**/details_harness|winogrande|5_2024-02-02T22-05-42.684950.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-02T22-05-42.684950.parquet'
- config_name: results
data_files:
- split: 2024_02_02T22_05_42.684950
path:
- results_2024-02-02T22-05-42.684950.parquet
- split: latest
path:
- results_2024-02-02T22-05-42.684950.parquet
---
# Dataset Card for Evaluation run of ChuckMcSneed/WinterGoddess-1.4x-70b-32k
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ChuckMcSneed/WinterGoddess-1.4x-70b-32k](https://huggingface.co/ChuckMcSneed/WinterGoddess-1.4x-70b-32k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ChuckMcSneed__WinterGoddess-1.4x-70b-32k",
"harness_winogrande_5",
split="train")
```
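The dated split names (e.g. `2024_02_02T22_05_42.684950`) encode the run timestamp with underscores standing in for the `-` and `:` characters that are not valid in split identifiers. If you need to order runs chronologically yourself rather than rely on the `latest` split, a small helper like this sketch (the name `parse_run_split` is ours, and the split-name format is inferred from this card) can convert them back into `datetime` objects:

```python
from datetime import datetime

def parse_run_split(split_name: str) -> datetime:
    """Parse a run split name such as '2024_02_02T22_05_42.684950'
    into a datetime by restoring the '-' and ':' separators."""
    date_part, time_part = split_name.split("T")
    normalized = date_part.replace("_", "-") + "T" + time_part.replace("_", ":")
    return datetime.fromisoformat(normalized)

# Pick the most recent run from a list of split names.
splits = ["2024_02_02T22_05_42.684950"]
latest_run = max(splits, key=parse_run_split)
```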
## Latest results
These are the [latest results from run 2024-02-02T22:05:42.684950](https://huggingface.co/datasets/open-llm-leaderboard/details_ChuckMcSneed__WinterGoddess-1.4x-70b-32k/blob/main/results_2024-02-02T22-05-42.684950.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6639765808635061,
"acc_stderr": 0.03149901214232908,
"acc_norm": 0.6688177091509891,
"acc_norm_stderr": 0.03212294826804706,
"mc1": 0.4700122399020808,
"mc1_stderr": 0.01747199209169754,
"mc2": 0.6387436130479125,
"mc2_stderr": 0.014303842525660086
},
"harness|arc:challenge|25": {
"acc": 0.6732081911262798,
"acc_stderr": 0.013706665975587338,
"acc_norm": 0.71160409556314,
"acc_norm_stderr": 0.013238394422428171
},
"harness|hellaswag|10": {
"acc": 0.7134037044413464,
"acc_stderr": 0.004512471612415591,
"acc_norm": 0.8911571400119498,
"acc_norm_stderr": 0.0031080545633521087
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7828947368421053,
"acc_stderr": 0.03355045304882924,
"acc_norm": 0.7828947368421053,
"acc_norm_stderr": 0.03355045304882924
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.02825420034443866,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.02825420034443866
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.03309615177059006,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.03309615177059006
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.0368122963339432,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.0368122963339432
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.047551296160629475,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.047551296160629475
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6127659574468085,
"acc_stderr": 0.03184389265339525,
"acc_norm": 0.6127659574468085,
"acc_norm_stderr": 0.03184389265339525
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.04692008381368909,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.04692008381368909
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.025279850397404904,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.025279850397404904
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.04343525428949098,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.04343525428949098
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7354838709677419,
"acc_stderr": 0.02509189237885928,
"acc_norm": 0.7354838709677419,
"acc_norm_stderr": 0.02509189237885928
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5369458128078818,
"acc_stderr": 0.035083705204426656,
"acc_norm": 0.5369458128078818,
"acc_norm_stderr": 0.035083705204426656
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8484848484848485,
"acc_stderr": 0.027998073798781668,
"acc_norm": 0.8484848484848485,
"acc_norm_stderr": 0.027998073798781668
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8636363636363636,
"acc_stderr": 0.024450155973189835,
"acc_norm": 0.8636363636363636,
"acc_norm_stderr": 0.024450155973189835
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.917098445595855,
"acc_stderr": 0.01989934131572178,
"acc_norm": 0.917098445595855,
"acc_norm_stderr": 0.01989934131572178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6461538461538462,
"acc_stderr": 0.024243783994062164,
"acc_norm": 0.6461538461538462,
"acc_norm_stderr": 0.024243783994062164
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.02730914058823018,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.02730914058823018
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6932773109243697,
"acc_stderr": 0.029953823891887058,
"acc_norm": 0.6932773109243697,
"acc_norm_stderr": 0.029953823891887058
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.45695364238410596,
"acc_stderr": 0.04067325174247443,
"acc_norm": 0.45695364238410596,
"acc_norm_stderr": 0.04067325174247443
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8788990825688073,
"acc_stderr": 0.013987618292389713,
"acc_norm": 0.8788990825688073,
"acc_norm_stderr": 0.013987618292389713
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5416666666666666,
"acc_stderr": 0.03398110890294636,
"acc_norm": 0.5416666666666666,
"acc_norm_stderr": 0.03398110890294636
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8872549019607843,
"acc_stderr": 0.02219857103945679,
"acc_norm": 0.8872549019607843,
"acc_norm_stderr": 0.02219857103945679
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8649789029535865,
"acc_stderr": 0.022245776632003694,
"acc_norm": 0.8649789029535865,
"acc_norm_stderr": 0.022245776632003694
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7892376681614349,
"acc_stderr": 0.027373095500540193,
"acc_norm": 0.7892376681614349,
"acc_norm_stderr": 0.027373095500540193
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159462,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159462
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8429752066115702,
"acc_stderr": 0.03321244842547128,
"acc_norm": 0.8429752066115702,
"acc_norm_stderr": 0.03321244842547128
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742179,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742179
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5892857142857143,
"acc_stderr": 0.04669510663875191,
"acc_norm": 0.5892857142857143,
"acc_norm_stderr": 0.04669510663875191
},
"harness|hendrycksTest-management|5": {
"acc": 0.8446601941747572,
"acc_stderr": 0.03586594738573974,
"acc_norm": 0.8446601941747572,
"acc_norm_stderr": 0.03586594738573974
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8974358974358975,
"acc_stderr": 0.019875655027867443,
"acc_norm": 0.8974358974358975,
"acc_norm_stderr": 0.019875655027867443
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8301404853128991,
"acc_stderr": 0.013428186370608303,
"acc_norm": 0.8301404853128991,
"acc_norm_stderr": 0.013428186370608303
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7601156069364162,
"acc_stderr": 0.022989592543123567,
"acc_norm": 0.7601156069364162,
"acc_norm_stderr": 0.022989592543123567
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.33631284916201115,
"acc_stderr": 0.01580100372914589,
"acc_norm": 0.33631284916201115,
"acc_norm_stderr": 0.01580100372914589
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.696078431372549,
"acc_stderr": 0.026336613469046633,
"acc_norm": 0.696078431372549,
"acc_norm_stderr": 0.026336613469046633
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7459807073954984,
"acc_stderr": 0.024723861504771693,
"acc_norm": 0.7459807073954984,
"acc_norm_stderr": 0.024723861504771693
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7746913580246914,
"acc_stderr": 0.02324620264781975,
"acc_norm": 0.7746913580246914,
"acc_norm_stderr": 0.02324620264781975
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5567375886524822,
"acc_stderr": 0.02963483847376601,
"acc_norm": 0.5567375886524822,
"acc_norm_stderr": 0.02963483847376601
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5625814863102999,
"acc_stderr": 0.01266981346493572,
"acc_norm": 0.5625814863102999,
"acc_norm_stderr": 0.01266981346493572
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6727941176470589,
"acc_stderr": 0.02850145286039656,
"acc_norm": 0.6727941176470589,
"acc_norm_stderr": 0.02850145286039656
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7401960784313726,
"acc_stderr": 0.017740899509177795,
"acc_norm": 0.7401960784313726,
"acc_norm_stderr": 0.017740899509177795
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.04494290866252091,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.04494290866252091
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7673469387755102,
"acc_stderr": 0.02704925791589618,
"acc_norm": 0.7673469387755102,
"acc_norm_stderr": 0.02704925791589618
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8059701492537313,
"acc_stderr": 0.027962677604768914,
"acc_norm": 0.8059701492537313,
"acc_norm_stderr": 0.027962677604768914
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036847,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036847
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5060240963855421,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.5060240963855421,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8011695906432749,
"acc_stderr": 0.030611116557432528,
"acc_norm": 0.8011695906432749,
"acc_norm_stderr": 0.030611116557432528
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4700122399020808,
"mc1_stderr": 0.01747199209169754,
"mc2": 0.6387436130479125,
"mc2_stderr": 0.014303842525660086
},
"harness|winogrande|5": {
"acc": 0.8255722178374112,
"acc_stderr": 0.010665187902498438
},
"harness|gsm8k|5": {
"acc": 0.43290371493555724,
"acc_stderr": 0.013647916362576056
}
}
```
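Because the per-task scores above are plain JSON, recomputing an aggregate is a few lines once the results dict is in memory. As a minimal sketch, the unweighted mean accuracy over the MMLU (`hendrycksTest`) tasks can be taken like this (the three-task dict below is a hand-copied excerpt from this card, not the full results file):

```python
# Excerpt of the per-task results dict; values copied from the JSON above.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.34},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6148148148148148},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.7828947368421053},
}

# Keep only the hendrycksTest entries and average their raw accuracies.
mmlu_accs = [
    v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest-")
]
mmlu_mean = sum(mmlu_accs) / len(mmlu_accs)
print(f"MMLU mean acc over {len(mmlu_accs)} tasks: {mmlu_mean:.4f}")
```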
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
indiehackers/aya_telugu_instruction | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: language
dtype: string
- name: language_code
dtype: string
- name: annotation_type
dtype: string
- name: user_id
dtype: string
splits:
- name: train
num_bytes: 10621418.962789824
num_examples: 8439
- name: test
num_bytes: 254601.14285714287
num_examples: 250
download_size: 7244064
dataset_size: 10876020.105646968
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
krvhrv/prompts_30percent | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 69682787.96191673
num_examples: 73632
download_size: 88536922
dataset_size: 69682787.96191673
---
# Dataset Card for "prompts_30percent"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ciaranmacseoin/Donut_SORIE | ---
dataset_info:
features:
- name: image
dtype: image
- name: ground_truth
dtype: string
splits:
- name: train
num_bytes: 159911028.0
num_examples: 626
- name: test
num_bytes: 47007392.0
num_examples: 173
- name: validation
num_bytes: 48663050.0
num_examples: 174
download_size: 187781935
dataset_size: 255581470.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
|
ChaiML/chaiverse_lora_testing_fandom_IO | ---
dataset_info:
features:
- name: input_text
dtype: string
- name: output_text
dtype: string
splits:
- name: train
num_bytes: 149058
num_examples: 100
download_size: 96520
dataset_size: 149058
---
# Dataset Card for "chaiverse_lora_testing_fandom_IO"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
DarqueDante/merged | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: text_token_length
dtype: int64
- name: text
dtype: string
- name: seed_data
dtype: string
- name: format
dtype: string
- name: audience
dtype: string
splits:
- name: train
num_bytes: 8764212493
num_examples: 1949895
download_size: 4436537749
dataset_size: 8764212493
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
tnpb/breas-cancer-wisconsin-kaggle | ---
license: mit
---
|
open-llm-leaderboard/details_dhmeltzer__llama-7b-SFT-qlora-eli5-wiki_DPO_ds_RM_top_2_1024_r_64_alpha_16 | ---
pretty_name: Evaluation run of dhmeltzer/llama-7b-SFT-qlora-eli5-wiki_DPO_ds_RM_top_2_1024_r_64_alpha_16
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [dhmeltzer/llama-7b-SFT-qlora-eli5-wiki_DPO_ds_RM_top_2_1024_r_64_alpha_16](https://huggingface.co/dhmeltzer/llama-7b-SFT-qlora-eli5-wiki_DPO_ds_RM_top_2_1024_r_64_alpha_16)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_dhmeltzer__llama-7b-SFT-qlora-eli5-wiki_DPO_ds_RM_top_2_1024_r_64_alpha_16\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-14T22:37:43.951322](https://huggingface.co/datasets/open-llm-leaderboard/details_dhmeltzer__llama-7b-SFT-qlora-eli5-wiki_DPO_ds_RM_top_2_1024_r_64_alpha_16/blob/main/results_2023-10-14T22-37-43.951322.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.01772231543624161,\n\
\ \"em_stderr\": 0.0013511918633877466,\n \"f1\": 0.0835318791946305,\n\
\ \"f1_stderr\": 0.001975755385564516,\n \"acc\": 0.3909370843114387,\n\
\ \"acc_stderr\": 0.00906125347819033\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.01772231543624161,\n \"em_stderr\": 0.0013511918633877466,\n\
\ \"f1\": 0.0835318791946305,\n \"f1_stderr\": 0.001975755385564516\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.045489006823351025,\n \
\ \"acc_stderr\": 0.005739657656722189\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7363851617995264,\n \"acc_stderr\": 0.012382849299658468\n\
\ }\n}\n```"
repo_url: https://huggingface.co/dhmeltzer/llama-7b-SFT-qlora-eli5-wiki_DPO_ds_RM_top_2_1024_r_64_alpha_16
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_07T13_33_12.339824
path:
- '**/details_harness|arc:challenge|25_2023-09-07T13-33-12.339824.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-07T13-33-12.339824.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_14T22_37_43.951322
path:
- '**/details_harness|drop|3_2023-10-14T22-37-43.951322.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-14T22-37-43.951322.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_14T22_37_43.951322
path:
- '**/details_harness|gsm8k|5_2023-10-14T22-37-43.951322.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-14T22-37-43.951322.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_07T13_33_12.339824
path:
- '**/details_harness|hellaswag|10_2023-09-07T13-33-12.339824.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-07T13-33-12.339824.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_07T13_33_12.339824
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-07T13-33-12.339824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-07T13-33-12.339824.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-07T13-33-12.339824.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_07T13_33_12.339824
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-07T13-33-12.339824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-07T13-33-12.339824.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_07T13_33_12.339824
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-07T13-33-12.339824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-07T13-33-12.339824.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_07T13_33_12.339824
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-07T13-33-12.339824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-07T13-33-12.339824.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_07T13_33_12.339824
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-07T13-33-12.339824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-07T13-33-12.339824.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_07T13_33_12.339824
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-07T13-33-12.339824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-07T13-33-12.339824.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_07T13_33_12.339824
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-07T13-33-12.339824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-07T13-33-12.339824.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_07T13_33_12.339824
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-07T13-33-12.339824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-07T13-33-12.339824.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_07T13_33_12.339824
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-07T13-33-12.339824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-07T13-33-12.339824.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_07T13_33_12.339824
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-07T13-33-12.339824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-07T13-33-12.339824.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_07T13_33_12.339824
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-07T13-33-12.339824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-07T13-33-12.339824.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_07T13_33_12.339824
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-07T13-33-12.339824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-07T13-33-12.339824.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_07T13_33_12.339824
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-07T13-33-12.339824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-07T13-33-12.339824.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_07T13_33_12.339824
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-07T13-33-12.339824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-07T13-33-12.339824.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_07T13_33_12.339824
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-07T13-33-12.339824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-07T13-33-12.339824.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_07T13_33_12.339824
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-07T13-33-12.339824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-07T13-33-12.339824.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_07T13_33_12.339824
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-07T13-33-12.339824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-07T13-33-12.339824.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_07T13_33_12.339824
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-07T13-33-12.339824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-07T13-33-12.339824.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_07T13_33_12.339824
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-07T13-33-12.339824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-07T13-33-12.339824.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_07T13_33_12.339824
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-07T13-33-12.339824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-07T13-33-12.339824.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_07T13_33_12.339824
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-07T13-33-12.339824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-07T13-33-12.339824.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_07T13_33_12.339824
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-07T13-33-12.339824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-07T13-33-12.339824.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_07T13_33_12.339824
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-07T13-33-12.339824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-07T13-33-12.339824.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_07T13_33_12.339824
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-07T13-33-12.339824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-07T13-33-12.339824.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_07T13_33_12.339824
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-07T13-33-12.339824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-07T13-33-12.339824.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_07T13_33_12.339824
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-07T13-33-12.339824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-07T13-33-12.339824.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_07T13_33_12.339824
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-07T13-33-12.339824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-07T13-33-12.339824.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_07T13_33_12.339824
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-07T13-33-12.339824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-07T13-33-12.339824.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_07T13_33_12.339824
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-07T13-33-12.339824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-07T13-33-12.339824.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_07T13_33_12.339824
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-07T13-33-12.339824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-07T13-33-12.339824.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_07T13_33_12.339824
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-07T13-33-12.339824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-07T13-33-12.339824.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_07T13_33_12.339824
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-07T13-33-12.339824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-07T13-33-12.339824.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_07T13_33_12.339824
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-07T13-33-12.339824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-07T13-33-12.339824.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_07T13_33_12.339824
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-07T13-33-12.339824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-07T13-33-12.339824.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_07T13_33_12.339824
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-07T13-33-12.339824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-07T13-33-12.339824.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_07T13_33_12.339824
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-07T13-33-12.339824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-07T13-33-12.339824.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_07T13_33_12.339824
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-07T13-33-12.339824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-07T13-33-12.339824.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_07T13_33_12.339824
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-07T13-33-12.339824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-07T13-33-12.339824.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_07T13_33_12.339824
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-07T13-33-12.339824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-07T13-33-12.339824.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_07T13_33_12.339824
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-07T13-33-12.339824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-07T13-33-12.339824.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_07T13_33_12.339824
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-07T13-33-12.339824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-07T13-33-12.339824.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_07T13_33_12.339824
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-07T13-33-12.339824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-07T13-33-12.339824.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_07T13_33_12.339824
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-07T13-33-12.339824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-07T13-33-12.339824.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_07T13_33_12.339824
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-07T13-33-12.339824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-07T13-33-12.339824.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_07T13_33_12.339824
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-07T13-33-12.339824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-07T13-33-12.339824.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_07T13_33_12.339824
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-07T13-33-12.339824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-07T13-33-12.339824.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_07T13_33_12.339824
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-07T13-33-12.339824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-07T13-33-12.339824.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_07T13_33_12.339824
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-07T13-33-12.339824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-07T13-33-12.339824.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_07T13_33_12.339824
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-07T13-33-12.339824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-07T13-33-12.339824.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_07T13_33_12.339824
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-07T13-33-12.339824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-07T13-33-12.339824.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_07T13_33_12.339824
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-07T13-33-12.339824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-07T13-33-12.339824.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_07T13_33_12.339824
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-07T13-33-12.339824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-07T13-33-12.339824.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_07T13_33_12.339824
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-07T13-33-12.339824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-07T13-33-12.339824.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_07T13_33_12.339824
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-07T13-33-12.339824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-07T13-33-12.339824.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_07T13_33_12.339824
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-07T13-33-12.339824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-07T13-33-12.339824.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_07T13_33_12.339824
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-07T13-33-12.339824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-07T13-33-12.339824.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_07T13_33_12.339824
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-07T13-33-12.339824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-07T13-33-12.339824.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_07T13_33_12.339824
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-07T13-33-12.339824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-07T13-33-12.339824.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_07T13_33_12.339824
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-07T13-33-12.339824.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-07T13-33-12.339824.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_14T22_37_43.951322
path:
- '**/details_harness|winogrande|5_2023-10-14T22-37-43.951322.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-14T22-37-43.951322.parquet'
- config_name: results
data_files:
- split: 2023_09_07T13_33_12.339824
path:
- results_2023-09-07T13-33-12.339824.parquet
- split: 2023_10_14T22_37_43.951322
path:
- results_2023-10-14T22-37-43.951322.parquet
- split: latest
path:
- results_2023-10-14T22-37-43.951322.parquet
---
# Dataset Card for Evaluation run of dhmeltzer/llama-7b-SFT-qlora-eli5-wiki_DPO_ds_RM_top_2_1024_r_64_alpha_16
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/dhmeltzer/llama-7b-SFT-qlora-eli5-wiki_DPO_ds_RM_top_2_1024_r_64_alpha_16
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [dhmeltzer/llama-7b-SFT-qlora-eli5-wiki_DPO_ds_RM_top_2_1024_r_64_alpha_16](https://huggingface.co/dhmeltzer/llama-7b-SFT-qlora-eli5-wiki_DPO_ds_RM_top_2_1024_r_64_alpha_16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_dhmeltzer__llama-7b-SFT-qlora-eli5-wiki_DPO_ds_RM_top_2_1024_r_64_alpha_16",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-14T22:37:43.951322](https://huggingface.co/datasets/open-llm-leaderboard/details_dhmeltzer__llama-7b-SFT-qlora-eli5-wiki_DPO_ds_RM_top_2_1024_r_64_alpha_16/blob/main/results_2023-10-14T22-37-43.951322.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.01772231543624161,
"em_stderr": 0.0013511918633877466,
"f1": 0.0835318791946305,
"f1_stderr": 0.001975755385564516,
"acc": 0.3909370843114387,
"acc_stderr": 0.00906125347819033
},
"harness|drop|3": {
"em": 0.01772231543624161,
"em_stderr": 0.0013511918633877466,
"f1": 0.0835318791946305,
"f1_stderr": 0.001975755385564516
},
"harness|gsm8k|5": {
"acc": 0.045489006823351025,
"acc_stderr": 0.005739657656722189
},
"harness|winogrande|5": {
"acc": 0.7363851617995264,
"acc_stderr": 0.012382849299658468
}
}
```
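For orientation when reading these files: in this run, the top-level `"all"` block appears to be the unweighted mean of the per-task metrics. The following minimal sketch checks that against the values shown above (a sanity check on the numbers in this card, not a description of the leaderboard's internal aggregation code):

```python
# Per-task accuracies copied from the "latest results" JSON above.
task_acc = {
    "harness|gsm8k|5": 0.045489006823351025,
    "harness|winogrande|5": 0.7363851617995264,
}

# Unweighted mean across tasks.
aggregate_acc = sum(task_acc.values()) / len(task_acc)

# Agrees with the "all" -> "acc" value reported above (0.3909370843114387).
print(aggregate_acc)
```

The same relationship holds for `acc_stderr` in this run, so each task contributes equally to the headline number regardless of its size.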
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
izumi-lab/open-text-books | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 281723992
num_examples: 149700
download_size: 152345811
dataset_size: 281723992
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: cc-by-sa-4.0
language:
- en
---
# Dataset Card for "open-text-books"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_amu__orpo-lora-phi2 | ---
pretty_name: Evaluation run of amu/orpo-lora-phi2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [amu/orpo-lora-phi2](https://huggingface.co/amu/orpo-lora-phi2) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_amu__orpo-lora-phi2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-31T09:12:19.501086](https://huggingface.co/datasets/open-llm-leaderboard/details_amu__orpo-lora-phi2/blob/main/results_2024-03-31T09-12-19.501086.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5821416277179048,\n\
\ \"acc_stderr\": 0.033770146849493435,\n \"acc_norm\": 0.5843594794170675,\n\
\ \"acc_norm_stderr\": 0.03445851343693101,\n \"mc1\": 0.3072215422276622,\n\
\ \"mc1_stderr\": 0.01615020132132302,\n \"mc2\": 0.44496168517736456,\n\
\ \"mc2_stderr\": 0.014885989610479747\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5665529010238908,\n \"acc_stderr\": 0.014481376224558902,\n\
\ \"acc_norm\": 0.6032423208191127,\n \"acc_norm_stderr\": 0.014296513020180642\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5540728938458475,\n\
\ \"acc_stderr\": 0.004960516570284905,\n \"acc_norm\": 0.7457677753435571,\n\
\ \"acc_norm_stderr\": 0.004345388614520023\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.04605661864718381,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.04605661864718381\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45925925925925926,\n\
\ \"acc_stderr\": 0.04304979692464242,\n \"acc_norm\": 0.45925925925925926,\n\
\ \"acc_norm_stderr\": 0.04304979692464242\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5723684210526315,\n \"acc_stderr\": 0.04026097083296563,\n\
\ \"acc_norm\": 0.5723684210526315,\n \"acc_norm_stderr\": 0.04026097083296563\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n\
\ \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\"\
: 0.05\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"\
acc\": 0.6150943396226415,\n \"acc_stderr\": 0.02994649856769995,\n \
\ \"acc_norm\": 0.6150943396226415,\n \"acc_norm_stderr\": 0.02994649856769995\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6597222222222222,\n\
\ \"acc_stderr\": 0.039621355734862175,\n \"acc_norm\": 0.6597222222222222,\n\
\ \"acc_norm_stderr\": 0.039621355734862175\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\
: 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5664739884393064,\n\
\ \"acc_stderr\": 0.03778621079092056,\n \"acc_norm\": 0.5664739884393064,\n\
\ \"acc_norm_stderr\": 0.03778621079092056\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.04784060704105654,\n\
\ \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.04784060704105654\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n\
\ \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5234042553191489,\n \"acc_stderr\": 0.032650194750335815,\n\
\ \"acc_norm\": 0.5234042553191489,\n \"acc_norm_stderr\": 0.032650194750335815\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.044346007015849245,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.044346007015849245\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4497354497354497,\n \"acc_stderr\": 0.02562085704293665,\n \"\
acc_norm\": 0.4497354497354497,\n \"acc_norm_stderr\": 0.02562085704293665\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n\
\ \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n\
\ \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6935483870967742,\n\
\ \"acc_stderr\": 0.026226485652553883,\n \"acc_norm\": 0.6935483870967742,\n\
\ \"acc_norm_stderr\": 0.026226485652553883\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.03517945038691063,\n\
\ \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\"\
: 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6242424242424243,\n \"acc_stderr\": 0.037818873532059816,\n\
\ \"acc_norm\": 0.6242424242424243,\n \"acc_norm_stderr\": 0.037818873532059816\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7474747474747475,\n \"acc_stderr\": 0.030954055470365907,\n \"\
acc_norm\": 0.7474747474747475,\n \"acc_norm_stderr\": 0.030954055470365907\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7927461139896373,\n \"acc_stderr\": 0.029252823291803624,\n\
\ \"acc_norm\": 0.7927461139896373,\n \"acc_norm_stderr\": 0.029252823291803624\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6128205128205129,\n \"acc_stderr\": 0.024697216930878944,\n\
\ \"acc_norm\": 0.6128205128205129,\n \"acc_norm_stderr\": 0.024697216930878944\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253252,\n \
\ \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253252\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6386554621848739,\n \"acc_stderr\": 0.031204691225150013,\n\
\ \"acc_norm\": 0.6386554621848739,\n \"acc_norm_stderr\": 0.031204691225150013\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.40397350993377484,\n \"acc_stderr\": 0.040064856853653415,\n \"\
acc_norm\": 0.40397350993377484,\n \"acc_norm_stderr\": 0.040064856853653415\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7963302752293578,\n \"acc_stderr\": 0.017266742087630804,\n \"\
acc_norm\": 0.7963302752293578,\n \"acc_norm_stderr\": 0.017266742087630804\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.46296296296296297,\n \"acc_stderr\": 0.03400603625538272,\n \"\
acc_norm\": 0.46296296296296297,\n \"acc_norm_stderr\": 0.03400603625538272\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6519607843137255,\n \"acc_stderr\": 0.03343311240488419,\n \"\
acc_norm\": 0.6519607843137255,\n \"acc_norm_stderr\": 0.03343311240488419\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.729957805907173,\n \"acc_stderr\": 0.028900721906293433,\n \
\ \"acc_norm\": 0.729957805907173,\n \"acc_norm_stderr\": 0.028900721906293433\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6547085201793722,\n\
\ \"acc_stderr\": 0.03191100192835794,\n \"acc_norm\": 0.6547085201793722,\n\
\ \"acc_norm_stderr\": 0.03191100192835794\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7022900763358778,\n \"acc_stderr\": 0.04010358942462203,\n\
\ \"acc_norm\": 0.7022900763358778,\n \"acc_norm_stderr\": 0.04010358942462203\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6944444444444444,\n\
\ \"acc_stderr\": 0.044531975073749834,\n \"acc_norm\": 0.6944444444444444,\n\
\ \"acc_norm_stderr\": 0.044531975073749834\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.033519538795212696,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.033519538795212696\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7087378640776699,\n \"acc_stderr\": 0.044986763205729224,\n\
\ \"acc_norm\": 0.7087378640776699,\n \"acc_norm_stderr\": 0.044986763205729224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8076923076923077,\n\
\ \"acc_stderr\": 0.025819233256483706,\n \"acc_norm\": 0.8076923076923077,\n\
\ \"acc_norm_stderr\": 0.025819233256483706\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6845466155810983,\n\
\ \"acc_stderr\": 0.01661750173876339,\n \"acc_norm\": 0.6845466155810983,\n\
\ \"acc_norm_stderr\": 0.01661750173876339\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.025416003773165538,\n\
\ \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.025416003773165538\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3027932960893855,\n\
\ \"acc_stderr\": 0.015366860386397112,\n \"acc_norm\": 0.3027932960893855,\n\
\ \"acc_norm_stderr\": 0.015366860386397112\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6176470588235294,\n \"acc_stderr\": 0.027826109307283697,\n\
\ \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.027826109307283697\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6237942122186495,\n\
\ \"acc_stderr\": 0.02751392568354943,\n \"acc_norm\": 0.6237942122186495,\n\
\ \"acc_norm_stderr\": 0.02751392568354943\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6172839506172839,\n \"acc_stderr\": 0.02704453813840259,\n\
\ \"acc_norm\": 0.6172839506172839,\n \"acc_norm_stderr\": 0.02704453813840259\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4432624113475177,\n \"acc_stderr\": 0.029634838473766002,\n \
\ \"acc_norm\": 0.4432624113475177,\n \"acc_norm_stderr\": 0.029634838473766002\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4172099087353325,\n\
\ \"acc_stderr\": 0.012593959992906419,\n \"acc_norm\": 0.4172099087353325,\n\
\ \"acc_norm_stderr\": 0.012593959992906419\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4852941176470588,\n \"acc_stderr\": 0.03035969707904611,\n\
\ \"acc_norm\": 0.4852941176470588,\n \"acc_norm_stderr\": 0.03035969707904611\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5473856209150327,\n \"acc_stderr\": 0.020136790918492527,\n \
\ \"acc_norm\": 0.5473856209150327,\n \"acc_norm_stderr\": 0.020136790918492527\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.02866685779027465,\n\
\ \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.02866685779027465\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8208955223880597,\n\
\ \"acc_stderr\": 0.027113286753111837,\n \"acc_norm\": 0.8208955223880597,\n\
\ \"acc_norm_stderr\": 0.027113286753111837\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \
\ \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.042295258468165065\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4819277108433735,\n\
\ \"acc_stderr\": 0.038899512528272166,\n \"acc_norm\": 0.4819277108433735,\n\
\ \"acc_norm_stderr\": 0.038899512528272166\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7192982456140351,\n \"acc_stderr\": 0.034462962170884265,\n\
\ \"acc_norm\": 0.7192982456140351,\n \"acc_norm_stderr\": 0.034462962170884265\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3072215422276622,\n\
\ \"mc1_stderr\": 0.01615020132132302,\n \"mc2\": 0.44496168517736456,\n\
\ \"mc2_stderr\": 0.014885989610479747\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7371744277821626,\n \"acc_stderr\": 0.012370922527262011\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5246398786959818,\n \
\ \"acc_stderr\": 0.013755751352764915\n }\n}\n```"
repo_url: https://huggingface.co/amu/orpo-lora-phi2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_31T09_12_19.501086
path:
- '**/details_harness|arc:challenge|25_2024-03-31T09-12-19.501086.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-31T09-12-19.501086.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_31T09_12_19.501086
path:
- '**/details_harness|gsm8k|5_2024-03-31T09-12-19.501086.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-31T09-12-19.501086.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_31T09_12_19.501086
path:
- '**/details_harness|hellaswag|10_2024-03-31T09-12-19.501086.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-31T09-12-19.501086.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_31T09_12_19.501086
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-31T09-12-19.501086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-31T09-12-19.501086.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-31T09-12-19.501086.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_31T09_12_19.501086
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-31T09-12-19.501086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-31T09-12-19.501086.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_31T09_12_19.501086
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-31T09-12-19.501086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-31T09-12-19.501086.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_31T09_12_19.501086
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-31T09-12-19.501086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-31T09-12-19.501086.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_31T09_12_19.501086
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-31T09-12-19.501086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-31T09-12-19.501086.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_31T09_12_19.501086
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-31T09-12-19.501086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-31T09-12-19.501086.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_31T09_12_19.501086
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-31T09-12-19.501086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-31T09-12-19.501086.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_31T09_12_19.501086
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-31T09-12-19.501086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-31T09-12-19.501086.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_31T09_12_19.501086
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-31T09-12-19.501086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-31T09-12-19.501086.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_31T09_12_19.501086
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-31T09-12-19.501086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-31T09-12-19.501086.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_31T09_12_19.501086
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-31T09-12-19.501086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-31T09-12-19.501086.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_31T09_12_19.501086
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-31T09-12-19.501086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-31T09-12-19.501086.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_31T09_12_19.501086
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-31T09-12-19.501086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-31T09-12-19.501086.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_31T09_12_19.501086
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-31T09-12-19.501086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-31T09-12-19.501086.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_31T09_12_19.501086
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-31T09-12-19.501086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-31T09-12-19.501086.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_31T09_12_19.501086
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-31T09-12-19.501086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-31T09-12-19.501086.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_31T09_12_19.501086
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-31T09-12-19.501086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-31T09-12-19.501086.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_31T09_12_19.501086
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-31T09-12-19.501086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-31T09-12-19.501086.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_31T09_12_19.501086
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-31T09-12-19.501086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-31T09-12-19.501086.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_31T09_12_19.501086
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-31T09-12-19.501086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-31T09-12-19.501086.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_31T09_12_19.501086
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-31T09-12-19.501086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-31T09-12-19.501086.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_31T09_12_19.501086
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-31T09-12-19.501086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-31T09-12-19.501086.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_31T09_12_19.501086
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-31T09-12-19.501086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-31T09-12-19.501086.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_31T09_12_19.501086
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-31T09-12-19.501086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-31T09-12-19.501086.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_31T09_12_19.501086
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-31T09-12-19.501086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-31T09-12-19.501086.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_31T09_12_19.501086
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-31T09-12-19.501086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-31T09-12-19.501086.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_31T09_12_19.501086
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-31T09-12-19.501086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-31T09-12-19.501086.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_31T09_12_19.501086
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-31T09-12-19.501086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-31T09-12-19.501086.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_31T09_12_19.501086
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-31T09-12-19.501086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-31T09-12-19.501086.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_31T09_12_19.501086
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-31T09-12-19.501086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-31T09-12-19.501086.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_31T09_12_19.501086
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-31T09-12-19.501086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-31T09-12-19.501086.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_31T09_12_19.501086
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-31T09-12-19.501086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-31T09-12-19.501086.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_31T09_12_19.501086
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-31T09-12-19.501086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-31T09-12-19.501086.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_31T09_12_19.501086
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-31T09-12-19.501086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-31T09-12-19.501086.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_31T09_12_19.501086
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-31T09-12-19.501086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-31T09-12-19.501086.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_31T09_12_19.501086
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-31T09-12-19.501086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-31T09-12-19.501086.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_31T09_12_19.501086
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-31T09-12-19.501086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-31T09-12-19.501086.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_31T09_12_19.501086
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-31T09-12-19.501086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-31T09-12-19.501086.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_31T09_12_19.501086
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-31T09-12-19.501086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-31T09-12-19.501086.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_31T09_12_19.501086
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-31T09-12-19.501086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-31T09-12-19.501086.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_31T09_12_19.501086
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-31T09-12-19.501086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-31T09-12-19.501086.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_31T09_12_19.501086
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-31T09-12-19.501086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-31T09-12-19.501086.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_31T09_12_19.501086
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-31T09-12-19.501086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-31T09-12-19.501086.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_31T09_12_19.501086
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-31T09-12-19.501086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-31T09-12-19.501086.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_31T09_12_19.501086
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-31T09-12-19.501086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-31T09-12-19.501086.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_31T09_12_19.501086
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-31T09-12-19.501086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-31T09-12-19.501086.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_31T09_12_19.501086
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-31T09-12-19.501086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-31T09-12-19.501086.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_31T09_12_19.501086
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-31T09-12-19.501086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-31T09-12-19.501086.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_31T09_12_19.501086
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-31T09-12-19.501086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-31T09-12-19.501086.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_31T09_12_19.501086
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-31T09-12-19.501086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-31T09-12-19.501086.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_31T09_12_19.501086
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-31T09-12-19.501086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-31T09-12-19.501086.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_31T09_12_19.501086
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-31T09-12-19.501086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-31T09-12-19.501086.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_31T09_12_19.501086
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-31T09-12-19.501086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-31T09-12-19.501086.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_31T09_12_19.501086
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-31T09-12-19.501086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-31T09-12-19.501086.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_31T09_12_19.501086
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-31T09-12-19.501086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-31T09-12-19.501086.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_31T09_12_19.501086
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-31T09-12-19.501086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-31T09-12-19.501086.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_31T09_12_19.501086
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-31T09-12-19.501086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-31T09-12-19.501086.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_31T09_12_19.501086
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-31T09-12-19.501086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-31T09-12-19.501086.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_31T09_12_19.501086
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-31T09-12-19.501086.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-31T09-12-19.501086.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_31T09_12_19.501086
path:
- '**/details_harness|winogrande|5_2024-03-31T09-12-19.501086.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-31T09-12-19.501086.parquet'
- config_name: results
data_files:
- split: 2024_03_31T09_12_19.501086
path:
- results_2024-03-31T09-12-19.501086.parquet
- split: latest
path:
- results_2024-03-31T09-12-19.501086.parquet
---
# Dataset Card for Evaluation run of amu/orpo-lora-phi2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [amu/orpo-lora-phi2](https://huggingface.co/amu/orpo-lora-phi2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_amu__orpo-lora-phi2",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-03-31T09:12:19.501086](https://huggingface.co/datasets/open-llm-leaderboard/details_amu__orpo-lora-phi2/blob/main/results_2024-03-31T09-12-19.501086.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5821416277179048,
"acc_stderr": 0.033770146849493435,
"acc_norm": 0.5843594794170675,
"acc_norm_stderr": 0.03445851343693101,
"mc1": 0.3072215422276622,
"mc1_stderr": 0.01615020132132302,
"mc2": 0.44496168517736456,
"mc2_stderr": 0.014885989610479747
},
"harness|arc:challenge|25": {
"acc": 0.5665529010238908,
"acc_stderr": 0.014481376224558902,
"acc_norm": 0.6032423208191127,
"acc_norm_stderr": 0.014296513020180642
},
"harness|hellaswag|10": {
"acc": 0.5540728938458475,
"acc_stderr": 0.004960516570284905,
"acc_norm": 0.7457677753435571,
"acc_norm_stderr": 0.004345388614520023
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.04605661864718381,
"acc_norm": 0.3,
"acc_norm_stderr": 0.04605661864718381
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45925925925925926,
"acc_stderr": 0.04304979692464242,
"acc_norm": 0.45925925925925926,
"acc_norm_stderr": 0.04304979692464242
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5723684210526315,
"acc_stderr": 0.04026097083296563,
"acc_norm": 0.5723684210526315,
"acc_norm_stderr": 0.04026097083296563
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6150943396226415,
"acc_stderr": 0.02994649856769995,
"acc_norm": 0.6150943396226415,
"acc_norm_stderr": 0.02994649856769995
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6597222222222222,
"acc_stderr": 0.039621355734862175,
"acc_norm": 0.6597222222222222,
"acc_norm_stderr": 0.039621355734862175
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5664739884393064,
"acc_stderr": 0.03778621079092056,
"acc_norm": 0.5664739884393064,
"acc_norm_stderr": 0.03778621079092056
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.04784060704105654,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.04784060704105654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5234042553191489,
"acc_stderr": 0.032650194750335815,
"acc_norm": 0.5234042553191489,
"acc_norm_stderr": 0.032650194750335815
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.044346007015849245,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.044346007015849245
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4497354497354497,
"acc_stderr": 0.02562085704293665,
"acc_norm": 0.4497354497354497,
"acc_norm_stderr": 0.02562085704293665
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6935483870967742,
"acc_stderr": 0.026226485652553883,
"acc_norm": 0.6935483870967742,
"acc_norm_stderr": 0.026226485652553883
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6242424242424243,
"acc_stderr": 0.037818873532059816,
"acc_norm": 0.6242424242424243,
"acc_norm_stderr": 0.037818873532059816
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7474747474747475,
"acc_stderr": 0.030954055470365907,
"acc_norm": 0.7474747474747475,
"acc_norm_stderr": 0.030954055470365907
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7927461139896373,
"acc_stderr": 0.029252823291803624,
"acc_norm": 0.7927461139896373,
"acc_norm_stderr": 0.029252823291803624
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6128205128205129,
"acc_stderr": 0.024697216930878944,
"acc_norm": 0.6128205128205129,
"acc_norm_stderr": 0.024697216930878944
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.028820884666253252,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.028820884666253252
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6386554621848739,
"acc_stderr": 0.031204691225150013,
"acc_norm": 0.6386554621848739,
"acc_norm_stderr": 0.031204691225150013
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.40397350993377484,
"acc_stderr": 0.040064856853653415,
"acc_norm": 0.40397350993377484,
"acc_norm_stderr": 0.040064856853653415
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7963302752293578,
"acc_stderr": 0.017266742087630804,
"acc_norm": 0.7963302752293578,
"acc_norm_stderr": 0.017266742087630804
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.46296296296296297,
"acc_stderr": 0.03400603625538272,
"acc_norm": 0.46296296296296297,
"acc_norm_stderr": 0.03400603625538272
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6519607843137255,
"acc_stderr": 0.03343311240488419,
"acc_norm": 0.6519607843137255,
"acc_norm_stderr": 0.03343311240488419
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.729957805907173,
"acc_stderr": 0.028900721906293433,
"acc_norm": 0.729957805907173,
"acc_norm_stderr": 0.028900721906293433
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6547085201793722,
"acc_stderr": 0.03191100192835794,
"acc_norm": 0.6547085201793722,
"acc_norm_stderr": 0.03191100192835794
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7022900763358778,
"acc_stderr": 0.04010358942462203,
"acc_norm": 0.7022900763358778,
"acc_norm_stderr": 0.04010358942462203
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.044531975073749834,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.044531975073749834
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7087378640776699,
"acc_stderr": 0.044986763205729224,
"acc_norm": 0.7087378640776699,
"acc_norm_stderr": 0.044986763205729224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8076923076923077,
"acc_stderr": 0.025819233256483706,
"acc_norm": 0.8076923076923077,
"acc_norm_stderr": 0.025819233256483706
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6845466155810983,
"acc_stderr": 0.01661750173876339,
"acc_norm": 0.6845466155810983,
"acc_norm_stderr": 0.01661750173876339
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.025416003773165538,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.025416003773165538
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3027932960893855,
"acc_stderr": 0.015366860386397112,
"acc_norm": 0.3027932960893855,
"acc_norm_stderr": 0.015366860386397112
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.027826109307283697,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.027826109307283697
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6237942122186495,
"acc_stderr": 0.02751392568354943,
"acc_norm": 0.6237942122186495,
"acc_norm_stderr": 0.02751392568354943
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6172839506172839,
"acc_stderr": 0.02704453813840259,
"acc_norm": 0.6172839506172839,
"acc_norm_stderr": 0.02704453813840259
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4432624113475177,
"acc_stderr": 0.029634838473766002,
"acc_norm": 0.4432624113475177,
"acc_norm_stderr": 0.029634838473766002
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4172099087353325,
"acc_stderr": 0.012593959992906419,
"acc_norm": 0.4172099087353325,
"acc_norm_stderr": 0.012593959992906419
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4852941176470588,
"acc_stderr": 0.03035969707904611,
"acc_norm": 0.4852941176470588,
"acc_norm_stderr": 0.03035969707904611
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5473856209150327,
"acc_stderr": 0.020136790918492527,
"acc_norm": 0.5473856209150327,
"acc_norm_stderr": 0.020136790918492527
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.02866685779027465,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.02866685779027465
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8208955223880597,
"acc_stderr": 0.027113286753111837,
"acc_norm": 0.8208955223880597,
"acc_norm_stderr": 0.027113286753111837
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4819277108433735,
"acc_stderr": 0.038899512528272166,
"acc_norm": 0.4819277108433735,
"acc_norm_stderr": 0.038899512528272166
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7192982456140351,
"acc_stderr": 0.034462962170884265,
"acc_norm": 0.7192982456140351,
"acc_norm_stderr": 0.034462962170884265
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3072215422276622,
"mc1_stderr": 0.01615020132132302,
"mc2": 0.44496168517736456,
"mc2_stderr": 0.014885989610479747
},
"harness|winogrande|5": {
"acc": 0.7371744277821626,
"acc_stderr": 0.012370922527262011
},
"harness|gsm8k|5": {
"acc": 0.5246398786959818,
"acc_stderr": 0.013755751352764915
}
}
```
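As an illustration of working with the aggregated block above, the sketch below averages the `acc` field over a small hand-copied subset of the per-task entries. The subset and the `mean_acc` helper are for illustration only; real use would load the full `results_*.parquet` file via the `results` config.

```python
# Average the "acc" metric across a few per-task entries copied from the
# results JSON above (illustrative subset, not the full task list).
results = {
    "harness|arc:challenge|25": {"acc": 0.5665529010238908},
    "harness|hellaswag|10": {"acc": 0.5540728938458475},
    "harness|winogrande|5": {"acc": 0.7371744277821626},
    "harness|gsm8k|5": {"acc": 0.5246398786959818},
}

def mean_acc(per_task):
    """Return the unweighted mean of the 'acc' field over all tasks."""
    accs = [v["acc"] for v in per_task.values()]
    return sum(accs) / len(accs)

print(round(mean_acc(results), 4))  # → 0.5956
```

Note this is an unweighted mean over tasks, which matches how the leaderboard averages benchmark scores rather than weighting by the number of examples per task.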
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
CyberHarem/rea_fireemblem | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of rea (Fire Emblem)
This is the dataset of rea (Fire Emblem), containing 500 images and their tags.
The core tags of this character are `long_hair, green_hair, green_eyes, breasts, hair_ornament, large_breasts, hair_flower, bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 680.40 MiB | [Download](https://huggingface.co/datasets/CyberHarem/rea_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 392.87 MiB | [Download](https://huggingface.co/datasets/CyberHarem/rea_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1143 | 788.72 MiB | [Download](https://huggingface.co/datasets/CyberHarem/rea_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 600.37 MiB | [Download](https://huggingface.co/datasets/CyberHarem/rea_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1143 | 1.06 GiB | [Download](https://huggingface.co/datasets/CyberHarem/rea_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/rea_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
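After loading, the per-item tag lists (`item.meta['tags']`) can be bucketed to mine outfits or traits like the clusters listed below. A minimal plain-Python sketch; the sample filenames and tags here are hypothetical, not taken from this dataset:

```python
from collections import defaultdict

# Hypothetical (filename, tags) pairs standing in for loaded item metadata.
items = [
    ("img_001.png", ["1girl", "solo", "tiara"]),
    ("img_002.png", ["1girl", "flower", "smile"]),
    ("img_003.png", ["1girl", "tiara", "smile"]),
]

def group_by_tag(pairs):
    """Map each tag to the list of filenames that carry it."""
    buckets = defaultdict(list)
    for filename, tags in pairs:
        for tag in tags:
            buckets[tag].append(filename)
    return dict(buckets)

groups = group_by_tag(items)
print(groups["tiara"])  # → ['img_001.png', 'img_003.png']
```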
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | 1girl, closed_mouth, long_sleeves, simple_background, solo, flower, white_dress, bare_shoulders, smile |
| 1 | 10 |  |  |  |  |  | 1girl, flower, solo, tiara, crown, closed_mouth, simple_background, smile, upper_body, white_background, portrait |
| 2 | 9 |  |  |  |  |  | 1girl, hair_ribbon, pointy_ears, ribbon_braid, side_braid, solo, tiara, twin_braids, closed_mouth, smile, simple_background, looking_at_viewer, upper_body |
| 3 | 8 |  |  |  |  |  | 1girl, barefoot, hair_ribbon, pointy_ears, ribbon_braid, solo, tiara, twin_braids, very_long_hair, blue_dress, anklet, floating_hair, full_body, armpits, side_braid, smile, sparkle, open_mouth |
| 4 | 6 |  |  |  |  |  | 1girl, fur_trim, gift_box, hair_ribbon, pointy_ears, ribbon_braid, solo, tiara, twin_braids, christmas_ornaments, smile, dress, holding, open_mouth, sack, side_braid |
| 5 | 11 |  |  |  |  |  | 1girl, cleavage, closed_mouth, flower, smile, solo, white_bikini, looking_at_viewer, navel, simple_background, white_background |
| 6 | 5 |  |  |  |  |  | 1girl, blue_sky, cleavage, closed_mouth, day, flower, navel, outdoors, white_bikini, official_alternate_costume, smile, beach, solo_focus, water, 1boy, cloud, holding_hands, ocean |
| 7 | 5 |  |  |  |  |  | 1girl, bare_shoulders, beach, blue_sky, blush, cleavage, closed_mouth, collarbone, day, looking_at_viewer, navel, ocean, outdoors, parted_bangs, solo, stomach, thighs, alternate_costume, cowboy_shot, sunlight, black_bikini, cloud, earrings, forehead, sand, thigh_gap, skindentation, smile, umbrella, very_long_hair, water |
| 8 | 6 |  |  |  |  |  | 2girls, cleavage, closed_mouth, flower, navel, thighs, white_bikini, holding, jewelry, legs, sandals, simple_background, smile, full_body, looking_at_viewer, solo_focus, toes, bare_shoulders, circlet, grey_background, official_alternate_costume |
| 9 | 5 |  |  |  |  |  | 1girl, circlet, collarbone, looking_at_viewer, navel, parted_bangs, solo, thighs, white_panties, white_shirt, blush, crop_top, flower, smile, cleavage, parted_lips, bare_shoulders, legs, long_sleeves, lying, off-shoulder_shirt, on_bed, tassel |
| 10 | 12 |  |  |  |  |  | 1girl, bare_shoulders, solo, cleavage, flower, parted_bangs, white_dress, blush, collarbone, looking_at_viewer, smile, circlet, thighs, sitting |
| 11 | 5 |  |  |  |  |  | 1girl, bare_shoulders, blush, crop_top, long_sleeves, midriff, solo, circlet, closed_mouth, collarbone, flower, green_pants, high-waist_pants, looking_at_viewer, navel, parted_bangs, thighs, tight_pants, white_shirt, alternate_costume, cleavage, contemporary, off-shoulder_shirt, smile, yoga_pants, dated, hand_on_hip, simple_background, tassel |
| 12 | 22 |  |  |  |  |  | witch_hat, 1girl, solo, halloween_costume, official_alternate_costume, looking_at_viewer, smile, very_long_hair, long_sleeves, blue_dress, simple_background, wide_sleeves, collarbone, holding, blush, closed_mouth, hat_flower, long_dress |
| 13 | 10 |  |  |  |  |  | blush, completely_nude, 1girl, nipples, open_mouth, penis, uncensored, hetero, pussy, sex, vaginal, 1boy, solo_focus, pointy_ears, sweat, anus, artist_name, ass, english_text, flower, navel, spread_legs |
| 14 | 10 |  |  |  |  |  | 1girl, hetero, blush, flower, solo_focus, fellatio, mosaic_censoring, 1boy, cum, looking_at_viewer, nipples, pubic_hair, gangbang, handjob, huge_breasts, multiple_boys, multiple_penises |
| 15 | 5 |  |  |  |  |  | 1boy, 1girl, bar_censor, blush, flower, hetero, penis, alternate_hair_color, breast_sucking, gloved_handjob, huge_breasts, nursing_handjob, smile, tiara, ejaculation, nipples, short_hair, breastfeeding, closed_eyes, crown, grabbing, lactation, open_mouth |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | closed_mouth | long_sleeves | simple_background | solo | flower | white_dress | bare_shoulders | smile | tiara | crown | upper_body | white_background | portrait | hair_ribbon | pointy_ears | ribbon_braid | side_braid | twin_braids | looking_at_viewer | barefoot | very_long_hair | blue_dress | anklet | floating_hair | full_body | armpits | sparkle | open_mouth | fur_trim | gift_box | christmas_ornaments | dress | holding | sack | cleavage | white_bikini | navel | blue_sky | day | outdoors | official_alternate_costume | beach | solo_focus | water | 1boy | cloud | holding_hands | ocean | blush | collarbone | parted_bangs | stomach | thighs | alternate_costume | cowboy_shot | sunlight | black_bikini | earrings | forehead | sand | thigh_gap | skindentation | umbrella | 2girls | jewelry | legs | sandals | toes | circlet | grey_background | white_panties | white_shirt | crop_top | parted_lips | lying | off-shoulder_shirt | on_bed | tassel | sitting | midriff | green_pants | high-waist_pants | tight_pants | contemporary | yoga_pants | dated | hand_on_hip | witch_hat | halloween_costume | wide_sleeves | hat_flower | long_dress | completely_nude | nipples | penis | uncensored | hetero | pussy | sex | vaginal | sweat | anus | artist_name | ass | english_text | spread_legs | fellatio | mosaic_censoring | cum | pubic_hair | gangbang | handjob | huge_breasts | multiple_boys | multiple_penises | bar_censor | alternate_hair_color | breast_sucking | gloved_handjob | nursing_handjob | ejaculation | short_hair | breastfeeding | closed_eyes | grabbing | lactation |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:---------------|:---------------|:--------------------|:-------|:---------|:--------------|:-----------------|:--------|:--------|:--------|:-------------|:-------------------|:-----------|:--------------|:--------------|:---------------|:-------------|:--------------|:--------------------|:-----------|:-----------------|:-------------|:---------|:----------------|:------------|:----------|:----------|:-------------|:-----------|:-----------|:----------------------|:--------|:----------|:-------|:-----------|:---------------|:--------|:-----------|:------|:-----------|:-----------------------------|:--------|:-------------|:--------|:-------|:--------|:----------------|:--------|:--------|:-------------|:---------------|:----------|:---------|:--------------------|:--------------|:-----------|:---------------|:-----------|:-----------|:-------|:------------|:----------------|:-----------|:---------|:----------|:-------|:----------|:-------|:----------|:------------------|:----------------|:--------------|:-----------|:--------------|:--------|:---------------------|:---------|:---------|:----------|:----------|:--------------|:-------------------|:--------------|:---------------|:-------------|:--------|:--------------|:------------|:--------------------|:---------------|:-------------|:-------------|:------------------|:----------|:--------|:-------------|:---------|:--------|:------|:----------|:--------|:-------|:--------------|:------|:---------------|:--------------|:-----------|:-------------------|:------|:-------------|:-----------|:----------|:---------------|:----------------|:-------------------|:-------------|:-----------------------|:-----------------|:-----------------|:------------------|:--------------|:-------------|:----------------|:--------------|:-----------|:------------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 10 |  |  |  |  |  | X | X | | X | X | X | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 9 |  |  |  |  |  | X | X | | X | X | | | | X | X | | X | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 8 |  |  |  |  |  | X | | | | X | | | | X | X | | | | | X | X | X | X | X | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 6 |  |  |  |  |  | X | | | | X | | | | X | X | | | | | X | X | X | X | X | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 11 |  |  |  |  |  | X | X | | X | X | X | | | X | | | | X | | | | | | | X | | | | | | | | | | | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 5 |  |  |  |  |  | X | X | | | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 5 |  |  |  |  |  | X | X | | | X | | | X | X | | | | | | | | | | | X | | X | | | | | | | | | | | | | | X | | X | X | X | X | | X | | X | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 6 |  |  |  |  |  | | X | | X | | X | | X | X | | | | | | | | | | | X | | | | | | X | | | | | | | | X | | X | X | X | | | | X | | X | | | | | | | | | | X | | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 9 | 5 |  |  |  |  |  | X | | X | | X | X | | X | X | | | | | | | | | | | X | | | | | | | | | | | | | | | | X | | X | | | | | | | | | | | | X | X | X | | X | | | | | | | | | | | | | X | | | X | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 10 | 12 |  |  |  |  |  | X | | | | X | X | X | X | X | | | | | | | | | | | X | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | X | X | X | | X | | | | | | | | | | | | | | | | X | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 11 | 5 |  |  |  |  |  | X | X | X | X | X | X | | X | X | | | | | | | | | | | X | | | | | | | | | | | | | | | | X | | X | | | | | | | | | | | | X | X | X | | X | X | | | | | | | | | | | | | | | X | | | X | X | | | X | | X | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 12 | 22 |  |  |  |  |  | X | X | X | X | X | | | | X | | | | | | | | | | | X | | X | X | | | | | | | | | | | X | | | | | | | | X | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 13 | 10 |  |  |  |  |  | X | | | | | X | | | | | | | | | | X | | | | | | | | | | | | | X | | | | | | | | | X | | | | | | X | | X | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | |
| 14 | 10 |  |  |  |  |  | X | | | | | X | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | X | | X | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | X | | | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | |
| 15 | 5 |  |  |  |  |  | X | | | | | X | | | X | X | X | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | X | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | | X | | | | | | | | | | | | | | | | X | | | X | X | X | X | X | X | X | X | X | X | X |
|
sisi/audio_test | ---
dataset_info:
features:
- name: audio
dtype: audio
splits:
- name: train
num_bytes: 1520443.0
num_examples: 3
download_size: 1502791
dataset_size: 1520443.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
zhangshuoming/numeric_synth_eval | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 254396
num_examples: 1000
download_size: 18594
dataset_size: 254396
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "numeric_synth_eval"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
argmaxinc/librispeech | ---
license: cc-by-4.0
---
|
rajivkale/dataset-webhook-testing | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': ADONIS
'1': AFRICAN GIANT SWALLOWTAIL
'2': AMERICAN SNOOT
splits:
- name: train
num_bytes: 8825732.0
num_examples: 338
download_size: 8823395
dataset_size: 8825732.0
---
# Dataset Card for "input-dataset"
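The `class_label` block in the metadata above declares an integer-to-name mapping for the label column. A small stdlib-only sketch of how those ids decode (using the names from the metadata; the `datasets` library's `ClassLabel` provides the same mapping via `int2str`/`str2int`):

```python
# id <-> name mapping declared by the class_label feature above.
NAMES = ["ADONIS", "AFRICAN GIANT SWALLOWTAIL", "AMERICAN SNOOT"]
id2name = dict(enumerate(NAMES))
name2id = {name: i for i, name in id2name.items()}

def decode(label_id: int) -> str:
    """Turn a stored integer label back into its species name."""
    return id2name[label_id]
```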
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
NickyNicky/aya_dataset_multilingual_inputs_targets_ext3 | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: language
dtype: string
- name: language_code
dtype: string
- name: targets_es
dtype: string
- name: targets_en
dtype: string
- name: targets_fr
dtype: string
- name: targets_de
dtype: string
- name: inputs_es
dtype: string
- name: inputs_en
dtype: string
- name: inputs_fr
dtype: string
- name: inputs_de
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 2761582
num_examples: 1000
download_size: 1787156
dataset_size: 2761582
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
enzolutions/drupal | ---
license: apache-2.0
---
|
mwinn99/GPL6885 | ---
license: odbl
tags:
- biology
size_categories:
- 10K<n<100K
---
Original, raw data can be found in the Gene Expression Omnibus (GEO): https://www.ncbi.nlm.nih.gov/geo/
## Citation
Winnicki MJ, Brown CA, Porter HL, Giles CB, Wren JD, BioVDB: biological vector database for high-throughput gene expression meta-analysis, Frontiers in Artificial Intelligence 7 (2024)
https://www.frontiersin.org/articles/10.3389/frai.2024.1366273 |
Prag12/PowerfulAssistantV4-Llama2-3.5kDemo | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 2172588.2084095064
num_examples: 3500
- name: test
num_bytes: 121255.13320463321
num_examples: 191
download_size: 3557167
dataset_size: 2293843.3416141397
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
NobodyExistsOnTheInternet/gpt4mathsub463 | ---
license: mit
---
|
RafaelMPereira/HealthCareMagic-100k-Chat-Format-en | ---
license: apache-2.0
---
|
Ar4ikov/resd_studio | ---
dataset_info:
features:
- name: name
dtype: string
- name: path
dtype: string
- name: emotion
dtype: string
- name: speech
dtype: audio
splits:
- name: test
num_bytes: 96603538.0
num_examples: 280
- name: train
num_bytes: 398719157.336
num_examples: 1116
download_size: 485403675
dataset_size: 495322695.336
---
# Dataset Card for "resd_studio"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
vishavjeetyadav144/muzz | ---
license: mit
---
|
mmbazel/Taylor-Swift-Example | ---
license: apache-2.0
---
Note: This is a copy of https://www.kaggle.com/datasets/thespacefreak/taylor-swift-song-lyrics-all-albums that I'm hosting here for convenience, for a workshop. |
ftopal/huggingface-datasets-raw | ---
dataset_info:
features:
- name: sha
dtype: string
- name: text
dtype: string
- name: id
dtype: string
- name: tags
sequence: string
- name: created_at
dtype: string
- name: metadata
dtype: string
- name: last_modified
dtype: string
splits:
- name: train
num_bytes: 475940888
num_examples: 108754
download_size: 107845512
dataset_size: 475940888
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
sethapun/arithmetic_2as_1to100 | ---
dataset_info:
features:
- name: expression
dtype: string
- name: answer
dtype: int64
- name: label
dtype:
class_label:
names:
'0': 'false'
'1': 'true'
splits:
- name: train
num_bytes: 57764
num_examples: 2000
- name: validation
num_bytes: 11544
num_examples: 400
download_size: 19645
dataset_size: 69308
---
# Dataset Card for "arithmetic_2as_1to100"
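A minimal sketch of how rows matching the features above might be generated; this is an assumption about the construction, not the actual generator. Each row pairs a two-operation addition/subtraction expression over 1–100 with a stated answer and a true/false label saying whether that answer is correct:

```python
import random

def make_row(rng: random.Random) -> dict:
    """One hypothetical row shaped like the features above:
    expression, answer, and a 0/1 (false/true) label."""
    a, b, c = (rng.randint(1, 100) for _ in range(3))
    op1, op2 = (rng.choice("+-") for _ in range(2))
    expression = f"{a} {op1} {b} {op2} {c}"
    correct = eval(expression)  # safe here: built only from our own integers
    is_true = rng.random() < 0.5
    # For "false" rows, perturb the answer by a small nonzero offset.
    answer = correct if is_true else correct + rng.choice([-2, -1, 1, 2])
    return {"expression": expression, "answer": answer, "label": int(is_true)}

rng = random.Random(0)
rows = [make_row(rng) for _ in range(10)]
```

By construction, a row's label is 1 exactly when evaluating its expression yields the stated answer.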
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
likhithnemani/train_dataset | ---
dataset_info:
features:
- name: File Names
dtype: string
- name: Project Description
dtype: string
- name: Repo Name
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 18766639
num_examples: 1460
download_size: 4015356
dataset_size: 18766639
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-huangyt_FINETUNE2_3w-q_k_v_o_proj | ---
pretty_name: Evaluation run of CHIH-HUNG/llama-2-13b-huangyt_FINETUNE2_3w-q_k_v_o_proj
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [CHIH-HUNG/llama-2-13b-huangyt_FINETUNE2_3w-q_k_v_o_proj](https://huggingface.co/CHIH-HUNG/llama-2-13b-huangyt_FINETUNE2_3w-q_k_v_o_proj)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-huangyt_FINETUNE2_3w-q_k_v_o_proj\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-18T02:00:22.389736](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-huangyt_FINETUNE2_3w-q_k_v_o_proj/blob/main/results_2023-10-18T02-00-22.389736.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.3719588926174497,\n\
\ \"em_stderr\": 0.004949726013193945,\n \"f1\": 0.4084679110738261,\n\
\ \"f1_stderr\": 0.004843145937750956,\n \"acc\": 0.4480415851620389,\n\
\ \"acc_stderr\": 0.010535274120903989\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.3719588926174497,\n \"em_stderr\": 0.004949726013193945,\n\
\ \"f1\": 0.4084679110738261,\n \"f1_stderr\": 0.004843145937750956\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.1281273692191054,\n \
\ \"acc_stderr\": 0.009206398549980031\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7679558011049724,\n \"acc_stderr\": 0.011864149691827948\n\
\ }\n}\n```"
repo_url: https://huggingface.co/CHIH-HUNG/llama-2-13b-huangyt_FINETUNE2_3w-q_k_v_o_proj
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_02T11_14_02.105897
path:
- '**/details_harness|arc:challenge|25_2023-09-02T11:14:02.105897.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-02T11:14:02.105897.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_18T02_00_22.389736
path:
- '**/details_harness|drop|3_2023-10-18T02-00-22.389736.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-18T02-00-22.389736.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_18T02_00_22.389736
path:
- '**/details_harness|gsm8k|5_2023-10-18T02-00-22.389736.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-18T02-00-22.389736.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_02T11_14_02.105897
path:
- '**/details_harness|hellaswag|10_2023-09-02T11:14:02.105897.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-02T11:14:02.105897.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_02T11_14_02.105897
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-02T11:14:02.105897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-02T11:14:02.105897.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-02T11:14:02.105897.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_02T11_14_02.105897
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-02T11:14:02.105897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-02T11:14:02.105897.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_02T11_14_02.105897
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-02T11:14:02.105897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-02T11:14:02.105897.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_02T11_14_02.105897
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-02T11:14:02.105897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-02T11:14:02.105897.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_02T11_14_02.105897
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-02T11:14:02.105897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-02T11:14:02.105897.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_02T11_14_02.105897
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-02T11:14:02.105897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-02T11:14:02.105897.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_02T11_14_02.105897
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-02T11:14:02.105897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-02T11:14:02.105897.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_02T11_14_02.105897
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-02T11:14:02.105897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-02T11:14:02.105897.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_02T11_14_02.105897
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-02T11:14:02.105897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-02T11:14:02.105897.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_02T11_14_02.105897
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-02T11:14:02.105897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-02T11:14:02.105897.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_02T11_14_02.105897
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-02T11:14:02.105897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-02T11:14:02.105897.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_02T11_14_02.105897
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-02T11:14:02.105897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-02T11:14:02.105897.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_02T11_14_02.105897
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-02T11:14:02.105897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-02T11:14:02.105897.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_02T11_14_02.105897
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-02T11:14:02.105897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-02T11:14:02.105897.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_02T11_14_02.105897
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-02T11:14:02.105897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-02T11:14:02.105897.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_02T11_14_02.105897
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-02T11:14:02.105897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-02T11:14:02.105897.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_02T11_14_02.105897
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-02T11:14:02.105897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-02T11:14:02.105897.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_02T11_14_02.105897
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-02T11:14:02.105897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-02T11:14:02.105897.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_02T11_14_02.105897
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-02T11:14:02.105897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-02T11:14:02.105897.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_02T11_14_02.105897
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-02T11:14:02.105897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-02T11:14:02.105897.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_02T11_14_02.105897
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-02T11:14:02.105897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-02T11:14:02.105897.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_02T11_14_02.105897
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-02T11:14:02.105897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-02T11:14:02.105897.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_02T11_14_02.105897
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-02T11:14:02.105897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-02T11:14:02.105897.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_02T11_14_02.105897
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-02T11:14:02.105897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-02T11:14:02.105897.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_02T11_14_02.105897
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-02T11:14:02.105897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-02T11:14:02.105897.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_02T11_14_02.105897
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-02T11:14:02.105897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-02T11:14:02.105897.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_02T11_14_02.105897
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-02T11:14:02.105897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-02T11:14:02.105897.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_02T11_14_02.105897
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-02T11:14:02.105897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-02T11:14:02.105897.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_02T11_14_02.105897
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-02T11:14:02.105897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-02T11:14:02.105897.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_02T11_14_02.105897
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-02T11:14:02.105897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-02T11:14:02.105897.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_02T11_14_02.105897
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-02T11:14:02.105897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-02T11:14:02.105897.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_02T11_14_02.105897
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-02T11:14:02.105897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-02T11:14:02.105897.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_02T11_14_02.105897
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-02T11:14:02.105897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-02T11:14:02.105897.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_02T11_14_02.105897
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-02T11:14:02.105897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-02T11:14:02.105897.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_02T11_14_02.105897
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-02T11:14:02.105897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-02T11:14:02.105897.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_02T11_14_02.105897
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-02T11:14:02.105897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-02T11:14:02.105897.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_02T11_14_02.105897
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-02T11:14:02.105897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-02T11:14:02.105897.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_02T11_14_02.105897
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-02T11:14:02.105897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-02T11:14:02.105897.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_02T11_14_02.105897
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-02T11:14:02.105897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-02T11:14:02.105897.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_02T11_14_02.105897
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-02T11:14:02.105897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-02T11:14:02.105897.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_02T11_14_02.105897
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-02T11:14:02.105897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-02T11:14:02.105897.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_02T11_14_02.105897
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-02T11:14:02.105897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-02T11:14:02.105897.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_02T11_14_02.105897
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-02T11:14:02.105897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-02T11:14:02.105897.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_02T11_14_02.105897
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-02T11:14:02.105897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-02T11:14:02.105897.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_02T11_14_02.105897
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-02T11:14:02.105897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-02T11:14:02.105897.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_02T11_14_02.105897
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-02T11:14:02.105897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-02T11:14:02.105897.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_02T11_14_02.105897
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-02T11:14:02.105897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-02T11:14:02.105897.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_02T11_14_02.105897
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-02T11:14:02.105897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-02T11:14:02.105897.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_02T11_14_02.105897
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-02T11:14:02.105897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-02T11:14:02.105897.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_02T11_14_02.105897
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-02T11:14:02.105897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-02T11:14:02.105897.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_02T11_14_02.105897
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-02T11:14:02.105897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-02T11:14:02.105897.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_02T11_14_02.105897
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-02T11:14:02.105897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-02T11:14:02.105897.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_02T11_14_02.105897
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-02T11:14:02.105897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-02T11:14:02.105897.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_02T11_14_02.105897
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-02T11:14:02.105897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-02T11:14:02.105897.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_02T11_14_02.105897
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-02T11:14:02.105897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-02T11:14:02.105897.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_02T11_14_02.105897
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-02T11:14:02.105897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-02T11:14:02.105897.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_02T11_14_02.105897
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-02T11:14:02.105897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-02T11:14:02.105897.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_02T11_14_02.105897
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-02T11:14:02.105897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-02T11:14:02.105897.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_02T11_14_02.105897
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-02T11:14:02.105897.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-02T11:14:02.105897.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_18T02_00_22.389736
path:
- '**/details_harness|winogrande|5_2023-10-18T02-00-22.389736.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-18T02-00-22.389736.parquet'
- config_name: results
data_files:
- split: 2023_09_02T11_14_02.105897
path:
- results_2023-09-02T11:14:02.105897.parquet
- split: 2023_10_18T02_00_22.389736
path:
- results_2023-10-18T02-00-22.389736.parquet
- split: latest
path:
- results_2023-10-18T02-00-22.389736.parquet
---
# Dataset Card for Evaluation run of CHIH-HUNG/llama-2-13b-huangyt_FINETUNE2_3w-q_k_v_o_proj
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/CHIH-HUNG/llama-2-13b-huangyt_FINETUNE2_3w-q_k_v_o_proj
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [CHIH-HUNG/llama-2-13b-huangyt_FINETUNE2_3w-q_k_v_o_proj](https://huggingface.co/CHIH-HUNG/llama-2-13b-huangyt_FINETUNE2_3w-q_k_v_o_proj) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-huangyt_FINETUNE2_3w-q_k_v_o_proj",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-18T02:00:22.389736](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-huangyt_FINETUNE2_3w-q_k_v_o_proj/blob/main/results_2023-10-18T02-00-22.389736.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.3719588926174497,
"em_stderr": 0.004949726013193945,
"f1": 0.4084679110738261,
"f1_stderr": 0.004843145937750956,
"acc": 0.4480415851620389,
"acc_stderr": 0.010535274120903989
},
"harness|drop|3": {
"em": 0.3719588926174497,
"em_stderr": 0.004949726013193945,
"f1": 0.4084679110738261,
"f1_stderr": 0.004843145937750956
},
"harness|gsm8k|5": {
"acc": 0.1281273692191054,
"acc_stderr": 0.009206398549980031
},
"harness|winogrande|5": {
"acc": 0.7679558011049724,
"acc_stderr": 0.011864149691827948
}
}
```
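Since the results file is plain JSON, a single metric can be pulled out with the standard library alone. A minimal sketch (using the winogrande accuracy shown above, with the payload truncated to that one task):

```python
import json

# The aggregated results shown above, truncated to the winogrande task.
raw = '{"harness|winogrande|5": {"acc": 0.7679558011049724, "acc_stderr": 0.011864149691827948}}'

results = json.loads(raw)
winogrande_acc = results["harness|winogrande|5"]["acc"]
print(f"winogrande 5-shot accuracy: {winogrande_acc:.4f}")
```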
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
renumics/spotlight-vikp-textbook_quality_programming-enrichment | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: topic.embedding
sequence: float32
length: 2
- name: model.embedding
sequence: float32
length: 2
- name: markdown.embedding
sequence: float32
length: 2
splits:
- name: train
num_bytes: 279600
num_examples: 11650
download_size: 389517
dataset_size: 279600
---
# Dataset Card for "spotlight-vikp-textbook_quality_programming-enrichment"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mattymchen/natural-instruction-195 | ---
language:
- en
task_categories:
- text-classification
task_ids:
- sentiment-classification
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: test
num_bytes: 549795
num_examples: 6500
download_size: 388442
dataset_size: 549795
---
# Dataset Card for "natural-instruction-195"
## Dataset Description
NaturalInstruction task 195.
In this task, you are given a text from tweets. Your task is to classify the given tweet text into two categories: 1) positive, and 2) negative, based on its content.
## Data Fields
- `text`: Tweet text.
- `label`: Sentiment of the text, either "negative" (0) or "positive" (1).
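The integer labels above can be mapped back to their names with a small helper (a hypothetical sketch for illustration, not part of the dataset itself):

```python
# Hypothetical mapping from the integer labels described above to their names.
ID2LABEL = {0: "negative", 1: "positive"}

def label_name(label_id: int) -> str:
    """Return the sentiment name for a label id from this dataset."""
    return ID2LABEL[label_id]

print(label_name(1))  # "positive"
```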
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
metaeval/spartqa-mchoice | ---
license: mit
---
https://github.com/HLR/SpartQA-baselines
```
@inproceedings{mirzaee-etal-2021-spartqa,
title = "{SPARTQA}: A Textual Question Answering Benchmark for Spatial Reasoning",
author = "Mirzaee, Roshanak and
Rajaby Faghihi, Hossein and
Ning, Qiang and
Kordjamshidi, Parisa",
booktitle = "Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies",
month = jun,
year = "2021",
address = "Online",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2021.naacl-main.364",
doi = "10.18653/v1/2021.naacl-main.364",
pages = "4582--4598",
}
``` |
lavita/medical-qa-datasets | ---
language:
- en
task_categories:
- question-answering
tags:
- medical
- healthcare
- clinical
dataset_info:
- config_name: all-processed
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 269589377
num_examples: 239357
download_size: 155267884
dataset_size: 269589377
- config_name: chatdoctor-icliniq
features:
- name: input
dtype: string
- name: answer_icliniq
dtype: string
- name: answer_chatgpt
dtype: string
- name: answer_chatdoctor
dtype: string
splits:
- name: test
num_bytes: 16962106
num_examples: 7321
download_size: 9373079
dataset_size: 16962106
- config_name: chatdoctor_healthcaremagic
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 126454896
num_examples: 112165
download_size: 70518147
dataset_size: 126454896
- config_name: med-qa-en-4options-source
features:
- name: meta_info
dtype: string
- name: question
dtype: string
- name: answer_idx
dtype: string
- name: answer
dtype: string
- name: options
list:
- name: key
dtype: string
- name: value
dtype: string
- name: metamap_phrases
sequence: string
splits:
- name: train
num_bytes: 15420106
num_examples: 10178
- name: test
num_bytes: 1976582
num_examples: 1273
- name: validation
num_bytes: 1925861
num_examples: 1272
download_size: 9684872
dataset_size: 19322549
- config_name: med-qa-en-5options-source
features:
- name: meta_info
dtype: string
- name: question
dtype: string
- name: answer_idx
dtype: string
- name: answer
dtype: string
- name: options
list:
- name: key
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 9765366
num_examples: 10178
- name: test
num_bytes: 1248299
num_examples: 1273
- name: validation
num_bytes: 1220927
num_examples: 1272
download_size: 6704270
dataset_size: 12234592
- config_name: medical_meadow_cord19
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 1336834621
num_examples: 821007
download_size: 752855706
dataset_size: 1336834621
- config_name: medical_meadow_health_advice
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 2196957
num_examples: 8676
download_size: 890725
dataset_size: 2196957
- config_name: medical_meadow_medical_flashcards
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 16453987
num_examples: 33955
download_size: 6999958
dataset_size: 16453987
- config_name: medical_meadow_mediqa
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 15690088
num_examples: 2208
download_size: 3719929
dataset_size: 15690088
- config_name: medical_meadow_medqa
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 10225018
num_examples: 10178
download_size: 5505473
dataset_size: 10225018
- config_name: medical_meadow_mmmlu
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 1442124
num_examples: 3787
download_size: 685604
dataset_size: 1442124
- config_name: medical_meadow_pubmed_causal
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 846695
num_examples: 2446
download_size: 210947
dataset_size: 846695
- config_name: medical_meadow_wikidoc
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 10224074
num_examples: 10000
download_size: 5593178
dataset_size: 10224074
- config_name: medical_meadow_wikidoc_patient_information
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 3262558
num_examples: 5942
download_size: 1544286
dataset_size: 3262558
- config_name: medmcqa
features:
- name: id
dtype: string
- name: question
dtype: string
- name: opa
dtype: string
- name: opb
dtype: string
- name: opc
dtype: string
- name: opd
dtype: string
- name: cop
dtype:
class_label:
names:
'0': a
'1': b
'2': c
'3': d
- name: choice_type
dtype: string
- name: exp
dtype: string
- name: subject_name
dtype: string
- name: topic_name
dtype: string
splits:
- name: train
num_bytes: 131903297
num_examples: 182822
- name: test
num_bytes: 1399350
num_examples: 6150
- name: validation
num_bytes: 2221428
num_examples: 4183
download_size: 88311484
dataset_size: 135524075
- config_name: mmmlu-anatomy
features:
- name: input
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: target
dtype: string
splits:
- name: test
num_bytes: 31810
num_examples: 134
- name: validation
num_bytes: 2879
num_examples: 13
- name: train
num_bytes: 717
num_examples: 4
download_size: 35632
dataset_size: 35406
- config_name: mmmlu-clinical-knowledge
features:
- name: input
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: target
dtype: string
splits:
- name: test
num_bytes: 60710
num_examples: 264
- name: validation
num_bytes: 6231
num_examples: 28
- name: train
num_bytes: 1026
num_examples: 4
download_size: 60329
dataset_size: 67967
- config_name: mmmlu-college-biology
features:
- name: input
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: target
dtype: string
splits:
- name: test
num_bytes: 47319
num_examples: 143
- name: validation
num_bytes: 4462
num_examples: 15
- name: train
num_bytes: 1103
num_examples: 4
download_size: 49782
dataset_size: 52884
- config_name: mmmlu-college-medicine
features:
- name: input
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: target
dtype: string
splits:
- name: test
num_bytes: 80363
num_examples: 172
- name: validation
num_bytes: 7079
num_examples: 21
- name: train
num_bytes: 1434
num_examples: 4
download_size: 63671
dataset_size: 88876
- config_name: mmmlu-medical-genetics
features:
- name: input
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: target
dtype: string
splits:
- name: test
num_bytes: 20021
num_examples: 99
- name: validation
num_bytes: 2590
num_examples: 10
- name: train
num_bytes: 854
num_examples: 4
download_size: 29043
dataset_size: 23465
- config_name: mmmlu-professional-medicine
features:
- name: input
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: target
dtype: string
splits:
- name: test
num_bytes: 214495
num_examples: 271
- name: validation
num_bytes: 23003
num_examples: 30
- name: train
num_bytes: 2531
num_examples: 4
download_size: 157219
dataset_size: 240029
- config_name: pubmed-qa
features:
- name: QUESTION
dtype: string
- name: CONTEXTS
sequence: string
- name: LABELS
sequence: string
- name: MESHES
sequence: string
- name: YEAR
dtype: string
- name: reasoning_required_pred
dtype: string
- name: reasoning_free_pred
dtype: string
- name: final_decision
dtype: string
- name: LONG_ANSWER
dtype: string
splits:
- name: train
num_bytes: 421508218
num_examples: 200000
- name: validation
num_bytes: 23762218
num_examples: 11269
download_size: 233536544
dataset_size: 445270436
- config_name: truthful-qa-generation
features:
- name: type
dtype: string
- name: category
dtype: string
- name: question
dtype: string
- name: best_answer
dtype: string
- name: correct_answers
sequence: string
- name: incorrect_answers
sequence: string
- name: source
dtype: string
splits:
- name: validation
num_bytes: 473382
num_examples: 817
download_size: 222648
dataset_size: 473382
- config_name: truthful-qa-multiple-choice
features:
- name: question
dtype: string
- name: mc1_targets
struct:
- name: choices
sequence: string
- name: labels
sequence: int32
- name: mc2_targets
struct:
- name: choices
sequence: string
- name: labels
sequence: int32
splits:
- name: validation
num_bytes: 609082
num_examples: 817
download_size: 271032
dataset_size: 609082
- config_name: usmle-self-assessment-step1
features:
- name: question
dtype: string
- name: options
struct:
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: E
dtype: string
- name: F
dtype: string
- name: G
dtype: string
- name: H
dtype: string
- name: I
dtype: string
- name: answer
dtype: string
- name: answer_idx
dtype: string
splits:
- name: test
num_bytes: 80576
num_examples: 94
download_size: 60550
dataset_size: 80576
- config_name: usmle-self-assessment-step2
features:
- name: question
dtype: string
- name: options
struct:
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: E
dtype: string
- name: F
dtype: string
- name: G
dtype: string
- name: answer
dtype: string
- name: answer_idx
dtype: string
splits:
- name: test
num_bytes: 133267
num_examples: 109
download_size: 80678
dataset_size: 133267
- config_name: usmle-self-assessment-step3
features:
- name: question
dtype: string
- name: options
struct:
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: E
dtype: string
- name: F
dtype: string
- name: G
dtype: string
- name: answer
dtype: string
- name: answer_idx
dtype: string
splits:
- name: test
num_bytes: 156286
num_examples: 122
download_size: 98163
dataset_size: 156286
configs:
- config_name: all-processed
data_files:
- split: train
path: all-processed/train-*
- config_name: chatdoctor-icliniq
data_files:
- split: test
path: chatdoctor-icliniq/test-*
- config_name: chatdoctor_healthcaremagic
data_files:
- split: train
path: chatdoctor_healthcaremagic/train-*
- config_name: med-qa-en-4options-source
data_files:
- split: train
path: med-qa-en-4options-source/train-*
- split: test
path: med-qa-en-4options-source/test-*
- split: validation
path: med-qa-en-4options-source/validation-*
- config_name: med-qa-en-5options-source
data_files:
- split: train
path: med-qa-en-5options-source/train-*
- split: test
path: med-qa-en-5options-source/test-*
- split: validation
path: med-qa-en-5options-source/validation-*
- config_name: medical_meadow_cord19
data_files:
- split: train
path: medical_meadow_cord19/train-*
- config_name: medical_meadow_health_advice
data_files:
- split: train
path: medical_meadow_health_advice/train-*
- config_name: medical_meadow_medical_flashcards
data_files:
- split: train
path: medical_meadow_medical_flashcards/train-*
- config_name: medical_meadow_mediqa
data_files:
- split: train
path: medical_meadow_mediqa/train-*
- config_name: medical_meadow_medqa
data_files:
- split: train
path: medical_meadow_medqa/train-*
- config_name: medical_meadow_mmmlu
data_files:
- split: train
path: medical_meadow_mmmlu/train-*
- config_name: medical_meadow_pubmed_causal
data_files:
- split: train
path: medical_meadow_pubmed_causal/train-*
- config_name: medical_meadow_wikidoc
data_files:
- split: train
path: medical_meadow_wikidoc/train-*
- config_name: medical_meadow_wikidoc_patient_information
data_files:
- split: train
path: medical_meadow_wikidoc_patient_information/train-*
- config_name: medmcqa
data_files:
- split: train
path: medmcqa/train-*
- split: test
path: medmcqa/test-*
- split: validation
path: medmcqa/validation-*
- config_name: mmmlu-anatomy
data_files:
- split: test
path: mmmlu-anatomy/test-*
- split: validation
path: mmmlu-anatomy/validation-*
- split: train
path: mmmlu-anatomy/train-*
- config_name: mmmlu-clinical-knowledge
data_files:
- split: test
path: mmmlu-clinical-knowledge/test-*
- split: validation
path: mmmlu-clinical-knowledge/validation-*
- split: train
path: mmmlu-clinical-knowledge/train-*
- config_name: mmmlu-college-biology
data_files:
- split: test
path: mmmlu-college-biology/test-*
- split: validation
path: mmmlu-college-biology/validation-*
- split: train
path: mmmlu-college-biology/train-*
- config_name: mmmlu-college-medicine
data_files:
- split: test
path: mmmlu-college-medicine/test-*
- split: validation
path: mmmlu-college-medicine/validation-*
- split: train
path: mmmlu-college-medicine/train-*
- config_name: mmmlu-medical-genetics
data_files:
- split: test
path: mmmlu-medical-genetics/test-*
- split: validation
path: mmmlu-medical-genetics/validation-*
- split: train
path: mmmlu-medical-genetics/train-*
- config_name: mmmlu-professional-medicine
data_files:
- split: test
path: mmmlu-professional-medicine/test-*
- split: validation
path: mmmlu-professional-medicine/validation-*
- split: train
path: mmmlu-professional-medicine/train-*
- config_name: pubmed-qa
data_files:
- split: train
path: pubmed-qa/train-*
- split: validation
path: pubmed-qa/validation-*
- config_name: truthful-qa-generation
data_files:
- split: validation
path: truthful-qa-generation/validation-*
- config_name: truthful-qa-multiple-choice
data_files:
- split: validation
path: truthful-qa-multiple-choice/validation-*
- config_name: usmle-self-assessment-step1
data_files:
- split: test
path: usmle-self-assessment-step1/test-*
- config_name: usmle-self-assessment-step2
data_files:
- split: test
path: usmle-self-assessment-step2/test-*
- config_name: usmle-self-assessment-step3
data_files:
- split: test
path: usmle-self-assessment-step3/test-*
---
* The `all-processed` dataset is a concatenation of the `medical-meadow-*` and `chatdoctor_healthcaremagic` datasets
* The `Chat` `Doctor` term is replaced with the `chatbot` term in the `chatdoctor_healthcaremagic` dataset
* Following the literature, the `medical_meadow_cord19` dataset is subsampled to 50,000 samples
* `truthful-qa-*` is a benchmark dataset for evaluating the truthfulness of models in text generation, used in the Llama 2 paper. It contains 55 questions related to `Health` and 16 related to `Nutrition`, making it a valuable resource for medical question-answering scenarios. |
aminlouhichi/donut5Fournissuer | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: image
dtype: image
- name: ground_truth
dtype: string
splits:
- name: train
num_bytes: 22887975.0
num_examples: 106
- name: validation
num_bytes: 22887975.0
num_examples: 106
- name: test
num_bytes: 35690926.0
num_examples: 106
download_size: 69740850
dataset_size: 81466876.0
---
# Dataset Card for "donut5Fournissuer"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_OpenBuddy__openbuddy-mixtral-8x7b-v16.2-32k | ---
pretty_name: Evaluation run of OpenBuddy/openbuddy-mixtral-8x7b-v16.2-32k
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [OpenBuddy/openbuddy-mixtral-8x7b-v16.2-32k](https://huggingface.co/OpenBuddy/openbuddy-mixtral-8x7b-v16.2-32k)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_OpenBuddy__openbuddy-mixtral-8x7b-v16.2-32k\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-30T16:13:38.805323](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenBuddy__openbuddy-mixtral-8x7b-v16.2-32k/blob/main/results_2023-12-30T16-13-38.805323.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6950888574372245,\n\
\ \"acc_stderr\": 0.030251453163127155,\n \"acc_norm\": 0.7088207463824532,\n\
\ \"acc_norm_stderr\": 0.031043623179390627,\n \"mc1\": 0.40636474908200737,\n\
\ \"mc1_stderr\": 0.017193835812093907,\n \"mc2\": 0.5665232115821557,\n\
\ \"mc2_stderr\": 0.014849783912176732\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3267918088737201,\n \"acc_stderr\": 0.013706665975587338,\n\
\ \"acc_norm\": 0.3438566552901024,\n \"acc_norm_stderr\": 0.013880644570156208\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6337382991435969,\n\
\ \"acc_stderr\": 0.004807975515446488,\n \"acc_norm\": 0.817167894841665,\n\
\ \"acc_norm_stderr\": 0.0038573886135331043\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6814814814814815,\n\
\ \"acc_stderr\": 0.040247784019771096,\n \"acc_norm\": 0.6814814814814815,\n\
\ \"acc_norm_stderr\": 0.040247784019771096\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7828947368421053,\n \"acc_stderr\": 0.03355045304882923,\n\
\ \"acc_norm\": 0.7828947368421053,\n \"acc_norm_stderr\": 0.03355045304882923\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.74,\n\
\ \"acc_stderr\": 0.04408440022768077,\n \"acc_norm\": 0.74,\n \
\ \"acc_norm_stderr\": 0.04408440022768077\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7962264150943397,\n \"acc_stderr\": 0.024790784501775402,\n\
\ \"acc_norm\": 0.7962264150943397,\n \"acc_norm_stderr\": 0.024790784501775402\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8402777777777778,\n\
\ \"acc_stderr\": 0.030635578972093278,\n \"acc_norm\": 0.8402777777777778,\n\
\ \"acc_norm_stderr\": 0.030635578972093278\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \
\ \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.65,\n\
\ \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6936416184971098,\n\
\ \"acc_stderr\": 0.03514942551267438,\n \"acc_norm\": 0.6936416184971098,\n\
\ \"acc_norm_stderr\": 0.03514942551267438\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04975185951049946,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04975185951049946\n },\n\
\ \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.78,\n\
\ \"acc_stderr\": 0.04163331998932263,\n \"acc_norm\": 0.78,\n \
\ \"acc_norm_stderr\": 0.04163331998932263\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6595744680851063,\n \"acc_stderr\": 0.030976692998534425,\n\
\ \"acc_norm\": 0.6595744680851063,\n \"acc_norm_stderr\": 0.030976692998534425\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.6491228070175439,\n\
\ \"acc_stderr\": 0.04489539350270697,\n \"acc_norm\": 0.6491228070175439,\n\
\ \"acc_norm_stderr\": 0.04489539350270697\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6620689655172414,\n \"acc_stderr\": 0.0394170763206489,\n\
\ \"acc_norm\": 0.6620689655172414,\n \"acc_norm_stderr\": 0.0394170763206489\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.48677248677248675,\n \"acc_stderr\": 0.025742297289575142,\n \"\
acc_norm\": 0.48677248677248675,\n \"acc_norm_stderr\": 0.025742297289575142\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5555555555555556,\n\
\ \"acc_stderr\": 0.04444444444444449,\n \"acc_norm\": 0.5555555555555556,\n\
\ \"acc_norm_stderr\": 0.04444444444444449\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \
\ \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.049999999999999996\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.8516129032258064,\n \"acc_stderr\": 0.020222737554330374,\n \"\
acc_norm\": 0.8516129032258064,\n \"acc_norm_stderr\": 0.020222737554330374\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5812807881773399,\n \"acc_stderr\": 0.03471192860518468,\n \"\
acc_norm\": 0.5812807881773399,\n \"acc_norm_stderr\": 0.03471192860518468\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\"\
: 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8303030303030303,\n \"acc_stderr\": 0.02931118867498311,\n\
\ \"acc_norm\": 0.8303030303030303,\n \"acc_norm_stderr\": 0.02931118867498311\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8585858585858586,\n \"acc_stderr\": 0.024825909793343343,\n \"\
acc_norm\": 0.8585858585858586,\n \"acc_norm_stderr\": 0.024825909793343343\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.927461139896373,\n \"acc_stderr\": 0.018718998520678178,\n\
\ \"acc_norm\": 0.927461139896373,\n \"acc_norm_stderr\": 0.018718998520678178\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7076923076923077,\n \"acc_stderr\": 0.02306043838085774,\n \
\ \"acc_norm\": 0.7076923076923077,\n \"acc_norm_stderr\": 0.02306043838085774\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.35185185185185186,\n \"acc_stderr\": 0.02911661760608301,\n \
\ \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.02911661760608301\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.8025210084033614,\n \"acc_stderr\": 0.02585916412205146,\n \
\ \"acc_norm\": 0.8025210084033614,\n \"acc_norm_stderr\": 0.02585916412205146\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4503311258278146,\n \"acc_stderr\": 0.040622900186837764,\n \"\
acc_norm\": 0.4503311258278146,\n \"acc_norm_stderr\": 0.040622900186837764\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8917431192660551,\n \"acc_stderr\": 0.01332134844761176,\n \"\
acc_norm\": 0.8917431192660551,\n \"acc_norm_stderr\": 0.01332134844761176\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5740740740740741,\n \"acc_stderr\": 0.033723432716530624,\n \"\
acc_norm\": 0.5740740740740741,\n \"acc_norm_stderr\": 0.033723432716530624\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8627450980392157,\n \"acc_stderr\": 0.024152225962801588,\n \"\
acc_norm\": 0.8627450980392157,\n \"acc_norm_stderr\": 0.024152225962801588\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8860759493670886,\n \"acc_stderr\": 0.020681745135884565,\n \
\ \"acc_norm\": 0.8860759493670886,\n \"acc_norm_stderr\": 0.020681745135884565\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.757847533632287,\n\
\ \"acc_stderr\": 0.028751392398694755,\n \"acc_norm\": 0.757847533632287,\n\
\ \"acc_norm_stderr\": 0.028751392398694755\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752596,\n\
\ \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752596\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8512396694214877,\n \"acc_stderr\": 0.03248470083807194,\n \"\
acc_norm\": 0.8512396694214877,\n \"acc_norm_stderr\": 0.03248470083807194\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8611111111111112,\n\
\ \"acc_stderr\": 0.03343270062869621,\n \"acc_norm\": 0.8611111111111112,\n\
\ \"acc_norm_stderr\": 0.03343270062869621\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.03192193448934724,\n\
\ \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.03192193448934724\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5446428571428571,\n\
\ \"acc_stderr\": 0.04726835553719097,\n \"acc_norm\": 0.5446428571428571,\n\
\ \"acc_norm_stderr\": 0.04726835553719097\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8543689320388349,\n \"acc_stderr\": 0.0349260647662379,\n\
\ \"acc_norm\": 0.8543689320388349,\n \"acc_norm_stderr\": 0.0349260647662379\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9102564102564102,\n\
\ \"acc_stderr\": 0.01872430174194164,\n \"acc_norm\": 0.9102564102564102,\n\
\ \"acc_norm_stderr\": 0.01872430174194164\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036846,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.879948914431673,\n\
\ \"acc_stderr\": 0.011622736692041268,\n \"acc_norm\": 0.879948914431673,\n\
\ \"acc_norm_stderr\": 0.011622736692041268\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7774566473988439,\n \"acc_stderr\": 0.02239421566194282,\n\
\ \"acc_norm\": 0.7774566473988439,\n \"acc_norm_stderr\": 0.02239421566194282\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5329608938547487,\n\
\ \"acc_stderr\": 0.016686126653013934,\n \"acc_norm\": 0.5329608938547487,\n\
\ \"acc_norm_stderr\": 0.016686126653013934\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7843137254901961,\n \"acc_stderr\": 0.02355083135199509,\n\
\ \"acc_norm\": 0.7843137254901961,\n \"acc_norm_stderr\": 0.02355083135199509\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7942122186495176,\n\
\ \"acc_stderr\": 0.022961339906764244,\n \"acc_norm\": 0.7942122186495176,\n\
\ \"acc_norm_stderr\": 0.022961339906764244\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8055555555555556,\n \"acc_stderr\": 0.022021366100220204,\n\
\ \"acc_norm\": 0.8055555555555556,\n \"acc_norm_stderr\": 0.022021366100220204\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.549645390070922,\n \"acc_stderr\": 0.02968010556502904,\n \
\ \"acc_norm\": 0.549645390070922,\n \"acc_norm_stderr\": 0.02968010556502904\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5195567144719687,\n\
\ \"acc_stderr\": 0.012760464028289299,\n \"acc_norm\": 0.5195567144719687,\n\
\ \"acc_norm_stderr\": 0.012760464028289299\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7904411764705882,\n \"acc_stderr\": 0.024723110407677065,\n\
\ \"acc_norm\": 0.7904411764705882,\n \"acc_norm_stderr\": 0.024723110407677065\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7467320261437909,\n \"acc_stderr\": 0.01759348689536683,\n \
\ \"acc_norm\": 0.7467320261437909,\n \"acc_norm_stderr\": 0.01759348689536683\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n\
\ \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n\
\ \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8040816326530612,\n \"acc_stderr\": 0.025409301953225678,\n\
\ \"acc_norm\": 0.8040816326530612,\n \"acc_norm_stderr\": 0.025409301953225678\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8756218905472637,\n\
\ \"acc_stderr\": 0.023335401790166327,\n \"acc_norm\": 0.8756218905472637,\n\
\ \"acc_norm_stderr\": 0.023335401790166327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.0337997668989631,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.0337997668989631\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n\
\ \"acc_stderr\": 0.03891364495835817,\n \"acc_norm\": 0.5120481927710844,\n\
\ \"acc_norm_stderr\": 0.03891364495835817\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8538011695906432,\n \"acc_stderr\": 0.02709729011807082,\n\
\ \"acc_norm\": 0.8538011695906432,\n \"acc_norm_stderr\": 0.02709729011807082\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.40636474908200737,\n\
\ \"mc1_stderr\": 0.017193835812093907,\n \"mc2\": 0.5665232115821557,\n\
\ \"mc2_stderr\": 0.014849783912176732\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7782162588792423,\n \"acc_stderr\": 0.011676109244497813\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.002274450341167551,\n \
\ \"acc_stderr\": 0.001312157814867419\n }\n}\n```"
repo_url: https://huggingface.co/OpenBuddy/openbuddy-mixtral-8x7b-v16.2-32k
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_30T16_13_38.805323
path:
- '**/details_harness|arc:challenge|25_2023-12-30T16-13-38.805323.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-30T16-13-38.805323.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_30T16_13_38.805323
path:
- '**/details_harness|gsm8k|5_2023-12-30T16-13-38.805323.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-30T16-13-38.805323.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_30T16_13_38.805323
path:
- '**/details_harness|hellaswag|10_2023-12-30T16-13-38.805323.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-30T16-13-38.805323.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_30T16_13_38.805323
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T16-13-38.805323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T16-13-38.805323.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T16-13-38.805323.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_30T16_13_38.805323
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T16-13-38.805323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T16-13-38.805323.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_30T16_13_38.805323
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T16-13-38.805323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T16-13-38.805323.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_30T16_13_38.805323
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T16-13-38.805323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T16-13-38.805323.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_30T16_13_38.805323
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T16-13-38.805323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T16-13-38.805323.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_30T16_13_38.805323
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T16-13-38.805323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T16-13-38.805323.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_30T16_13_38.805323
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T16-13-38.805323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T16-13-38.805323.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_30T16_13_38.805323
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T16-13-38.805323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T16-13-38.805323.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_30T16_13_38.805323
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T16-13-38.805323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T16-13-38.805323.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_30T16_13_38.805323
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T16-13-38.805323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T16-13-38.805323.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_30T16_13_38.805323
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T16-13-38.805323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T16-13-38.805323.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_30T16_13_38.805323
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T16-13-38.805323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T16-13-38.805323.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_30T16_13_38.805323
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T16-13-38.805323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T16-13-38.805323.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_30T16_13_38.805323
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T16-13-38.805323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T16-13-38.805323.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_30T16_13_38.805323
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T16-13-38.805323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T16-13-38.805323.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_30T16_13_38.805323
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T16-13-38.805323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T16-13-38.805323.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_30T16_13_38.805323
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T16-13-38.805323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T16-13-38.805323.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_30T16_13_38.805323
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T16-13-38.805323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T16-13-38.805323.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_30T16_13_38.805323
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T16-13-38.805323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T16-13-38.805323.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_30T16_13_38.805323
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T16-13-38.805323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T16-13-38.805323.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_30T16_13_38.805323
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T16-13-38.805323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T16-13-38.805323.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_30T16_13_38.805323
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T16-13-38.805323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T16-13-38.805323.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_30T16_13_38.805323
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T16-13-38.805323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T16-13-38.805323.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_30T16_13_38.805323
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T16-13-38.805323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T16-13-38.805323.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_30T16_13_38.805323
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T16-13-38.805323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T16-13-38.805323.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_30T16_13_38.805323
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T16-13-38.805323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T16-13-38.805323.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_30T16_13_38.805323
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T16-13-38.805323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T16-13-38.805323.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_30T16_13_38.805323
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T16-13-38.805323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T16-13-38.805323.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_30T16_13_38.805323
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T16-13-38.805323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T16-13-38.805323.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_30T16_13_38.805323
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T16-13-38.805323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T16-13-38.805323.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_30T16_13_38.805323
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T16-13-38.805323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T16-13-38.805323.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_30T16_13_38.805323
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T16-13-38.805323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T16-13-38.805323.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_30T16_13_38.805323
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T16-13-38.805323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T16-13-38.805323.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_30T16_13_38.805323
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T16-13-38.805323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T16-13-38.805323.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_30T16_13_38.805323
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T16-13-38.805323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T16-13-38.805323.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_30T16_13_38.805323
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T16-13-38.805323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T16-13-38.805323.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_30T16_13_38.805323
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T16-13-38.805323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T16-13-38.805323.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_30T16_13_38.805323
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T16-13-38.805323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T16-13-38.805323.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_30T16_13_38.805323
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T16-13-38.805323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T16-13-38.805323.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_30T16_13_38.805323
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-30T16-13-38.805323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-30T16-13-38.805323.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_30T16_13_38.805323
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T16-13-38.805323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T16-13-38.805323.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_30T16_13_38.805323
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T16-13-38.805323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T16-13-38.805323.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_30T16_13_38.805323
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T16-13-38.805323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T16-13-38.805323.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_30T16_13_38.805323
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T16-13-38.805323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T16-13-38.805323.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_30T16_13_38.805323
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T16-13-38.805323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T16-13-38.805323.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_30T16_13_38.805323
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T16-13-38.805323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T16-13-38.805323.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_30T16_13_38.805323
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T16-13-38.805323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T16-13-38.805323.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_30T16_13_38.805323
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T16-13-38.805323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T16-13-38.805323.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_30T16_13_38.805323
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T16-13-38.805323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T16-13-38.805323.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_30T16_13_38.805323
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T16-13-38.805323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T16-13-38.805323.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_30T16_13_38.805323
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T16-13-38.805323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T16-13-38.805323.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_30T16_13_38.805323
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T16-13-38.805323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T16-13-38.805323.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_30T16_13_38.805323
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T16-13-38.805323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T16-13-38.805323.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_30T16_13_38.805323
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T16-13-38.805323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T16-13-38.805323.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_30T16_13_38.805323
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T16-13-38.805323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T16-13-38.805323.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_30T16_13_38.805323
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T16-13-38.805323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T16-13-38.805323.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_30T16_13_38.805323
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T16-13-38.805323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T16-13-38.805323.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_30T16_13_38.805323
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T16-13-38.805323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T16-13-38.805323.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_30T16_13_38.805323
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-30T16-13-38.805323.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-30T16-13-38.805323.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_30T16_13_38.805323
path:
- '**/details_harness|winogrande|5_2023-12-30T16-13-38.805323.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-30T16-13-38.805323.parquet'
- config_name: results
data_files:
- split: 2023_12_30T16_13_38.805323
path:
- results_2023-12-30T16-13-38.805323.parquet
- split: latest
path:
- results_2023-12-30T16-13-38.805323.parquet
---
# Dataset Card for Evaluation run of OpenBuddy/openbuddy-mixtral-8x7b-v16.2-32k
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [OpenBuddy/openbuddy-mixtral-8x7b-v16.2-32k](https://huggingface.co/OpenBuddy/openbuddy-mixtral-8x7b-v16.2-32k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_OpenBuddy__openbuddy-mixtral-8x7b-v16.2-32k",
"harness_winogrande_5",
split="train")
```
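The config names above follow a predictable pattern derived from the harness task name and shot count (e.g. `harness|hendrycksTest-abstract_algebra|5` becomes `harness_hendrycksTest_abstract_algebra_5`). A small helper, offered here as a hypothetical convenience rather than anything shipped with the card, can derive the config name to pass to `load_dataset`:

```python
def details_config_name(task: str, shots: int) -> str:
    """Map a harness task name and shot count to this card's config name.

    Non-alphanumeric separators in the task name ("-", ":") become
    underscores, matching the configs listed in the YAML above.
    """
    return "harness_" + task.replace("-", "_").replace(":", "_") + f"_{shots}"

# Examples matching configs declared in this card:
print(details_config_name("hendrycksTest-abstract_algebra", 5))  # harness_hendrycksTest_abstract_algebra_5
print(details_config_name("truthfulqa:mc", 0))                   # harness_truthfulqa_mc_0
print(details_config_name("winogrande", 5))                      # harness_winogrande_5
```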
## Latest results
These are the [latest results from run 2023-12-30T16:13:38.805323](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenBuddy__openbuddy-mixtral-8x7b-v16.2-32k/blob/main/results_2023-12-30T16-13-38.805323.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6950888574372245,
"acc_stderr": 0.030251453163127155,
"acc_norm": 0.7088207463824532,
"acc_norm_stderr": 0.031043623179390627,
"mc1": 0.40636474908200737,
"mc1_stderr": 0.017193835812093907,
"mc2": 0.5665232115821557,
"mc2_stderr": 0.014849783912176732
},
"harness|arc:challenge|25": {
"acc": 0.3267918088737201,
"acc_stderr": 0.013706665975587338,
"acc_norm": 0.3438566552901024,
"acc_norm_stderr": 0.013880644570156208
},
"harness|hellaswag|10": {
"acc": 0.6337382991435969,
"acc_stderr": 0.004807975515446488,
"acc_norm": 0.817167894841665,
"acc_norm_stderr": 0.0038573886135331043
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6814814814814815,
"acc_stderr": 0.040247784019771096,
"acc_norm": 0.6814814814814815,
"acc_norm_stderr": 0.040247784019771096
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7828947368421053,
"acc_stderr": 0.03355045304882923,
"acc_norm": 0.7828947368421053,
"acc_norm_stderr": 0.03355045304882923
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768077,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768077
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7962264150943397,
"acc_stderr": 0.024790784501775402,
"acc_norm": 0.7962264150943397,
"acc_norm_stderr": 0.024790784501775402
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8402777777777778,
"acc_stderr": 0.030635578972093278,
"acc_norm": 0.8402777777777778,
"acc_norm_stderr": 0.030635578972093278
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6936416184971098,
"acc_stderr": 0.03514942551267438,
"acc_norm": 0.6936416184971098,
"acc_norm_stderr": 0.03514942551267438
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5,
"acc_stderr": 0.04975185951049946,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04975185951049946
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932263,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932263
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6595744680851063,
"acc_stderr": 0.030976692998534425,
"acc_norm": 0.6595744680851063,
"acc_norm_stderr": 0.030976692998534425
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.6491228070175439,
"acc_stderr": 0.04489539350270697,
"acc_norm": 0.6491228070175439,
"acc_norm_stderr": 0.04489539350270697
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6620689655172414,
"acc_stderr": 0.0394170763206489,
"acc_norm": 0.6620689655172414,
"acc_norm_stderr": 0.0394170763206489
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.48677248677248675,
"acc_stderr": 0.025742297289575142,
"acc_norm": 0.48677248677248675,
"acc_norm_stderr": 0.025742297289575142
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.04444444444444449,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.04444444444444449
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8516129032258064,
"acc_stderr": 0.020222737554330374,
"acc_norm": 0.8516129032258064,
"acc_norm_stderr": 0.020222737554330374
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5812807881773399,
"acc_stderr": 0.03471192860518468,
"acc_norm": 0.5812807881773399,
"acc_norm_stderr": 0.03471192860518468
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8303030303030303,
"acc_stderr": 0.02931118867498311,
"acc_norm": 0.8303030303030303,
"acc_norm_stderr": 0.02931118867498311
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8585858585858586,
"acc_stderr": 0.024825909793343343,
"acc_norm": 0.8585858585858586,
"acc_norm_stderr": 0.024825909793343343
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.927461139896373,
"acc_stderr": 0.018718998520678178,
"acc_norm": 0.927461139896373,
"acc_norm_stderr": 0.018718998520678178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7076923076923077,
"acc_stderr": 0.02306043838085774,
"acc_norm": 0.7076923076923077,
"acc_norm_stderr": 0.02306043838085774
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.02911661760608301,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.02911661760608301
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8025210084033614,
"acc_stderr": 0.02585916412205146,
"acc_norm": 0.8025210084033614,
"acc_norm_stderr": 0.02585916412205146
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4503311258278146,
"acc_stderr": 0.040622900186837764,
"acc_norm": 0.4503311258278146,
"acc_norm_stderr": 0.040622900186837764
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8917431192660551,
"acc_stderr": 0.01332134844761176,
"acc_norm": 0.8917431192660551,
"acc_norm_stderr": 0.01332134844761176
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5740740740740741,
"acc_stderr": 0.033723432716530624,
"acc_norm": 0.5740740740740741,
"acc_norm_stderr": 0.033723432716530624
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8627450980392157,
"acc_stderr": 0.024152225962801588,
"acc_norm": 0.8627450980392157,
"acc_norm_stderr": 0.024152225962801588
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8860759493670886,
"acc_stderr": 0.020681745135884565,
"acc_norm": 0.8860759493670886,
"acc_norm_stderr": 0.020681745135884565
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.757847533632287,
"acc_stderr": 0.028751392398694755,
"acc_norm": 0.757847533632287,
"acc_norm_stderr": 0.028751392398694755
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.03446513350752596,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.03446513350752596
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8512396694214877,
"acc_stderr": 0.03248470083807194,
"acc_norm": 0.8512396694214877,
"acc_norm_stderr": 0.03248470083807194
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8611111111111112,
"acc_stderr": 0.03343270062869621,
"acc_norm": 0.8611111111111112,
"acc_norm_stderr": 0.03343270062869621
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7914110429447853,
"acc_stderr": 0.03192193448934724,
"acc_norm": 0.7914110429447853,
"acc_norm_stderr": 0.03192193448934724
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5446428571428571,
"acc_stderr": 0.04726835553719097,
"acc_norm": 0.5446428571428571,
"acc_norm_stderr": 0.04726835553719097
},
"harness|hendrycksTest-management|5": {
"acc": 0.8543689320388349,
"acc_stderr": 0.0349260647662379,
"acc_norm": 0.8543689320388349,
"acc_norm_stderr": 0.0349260647662379
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9102564102564102,
"acc_stderr": 0.01872430174194164,
"acc_norm": 0.9102564102564102,
"acc_norm_stderr": 0.01872430174194164
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.879948914431673,
"acc_stderr": 0.011622736692041268,
"acc_norm": 0.879948914431673,
"acc_norm_stderr": 0.011622736692041268
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7774566473988439,
"acc_stderr": 0.02239421566194282,
"acc_norm": 0.7774566473988439,
"acc_norm_stderr": 0.02239421566194282
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.5329608938547487,
"acc_stderr": 0.016686126653013934,
"acc_norm": 0.5329608938547487,
"acc_norm_stderr": 0.016686126653013934
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7843137254901961,
"acc_stderr": 0.02355083135199509,
"acc_norm": 0.7843137254901961,
"acc_norm_stderr": 0.02355083135199509
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7942122186495176,
"acc_stderr": 0.022961339906764244,
"acc_norm": 0.7942122186495176,
"acc_norm_stderr": 0.022961339906764244
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.022021366100220204,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.022021366100220204
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.549645390070922,
"acc_stderr": 0.02968010556502904,
"acc_norm": 0.549645390070922,
"acc_norm_stderr": 0.02968010556502904
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5195567144719687,
"acc_stderr": 0.012760464028289299,
"acc_norm": 0.5195567144719687,
"acc_norm_stderr": 0.012760464028289299
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7904411764705882,
"acc_stderr": 0.024723110407677065,
"acc_norm": 0.7904411764705882,
"acc_norm_stderr": 0.024723110407677065
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7467320261437909,
"acc_stderr": 0.01759348689536683,
"acc_norm": 0.7467320261437909,
"acc_norm_stderr": 0.01759348689536683
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8040816326530612,
"acc_stderr": 0.025409301953225678,
"acc_norm": 0.8040816326530612,
"acc_norm_stderr": 0.025409301953225678
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8756218905472637,
"acc_stderr": 0.023335401790166327,
"acc_norm": 0.8756218905472637,
"acc_norm_stderr": 0.023335401790166327
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.0337997668989631,
"acc_norm": 0.87,
"acc_norm_stderr": 0.0337997668989631
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5120481927710844,
"acc_stderr": 0.03891364495835817,
"acc_norm": 0.5120481927710844,
"acc_norm_stderr": 0.03891364495835817
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8538011695906432,
"acc_stderr": 0.02709729011807082,
"acc_norm": 0.8538011695906432,
"acc_norm_stderr": 0.02709729011807082
},
"harness|truthfulqa:mc|0": {
"mc1": 0.40636474908200737,
"mc1_stderr": 0.017193835812093907,
"mc2": 0.5665232115821557,
"mc2_stderr": 0.014849783912176732
},
"harness|winogrande|5": {
"acc": 0.7782162588792423,
"acc_stderr": 0.011676109244497813
},
"harness|gsm8k|5": {
"acc": 0.002274450341167551,
"acc_stderr": 0.001312157814867419
}
}
```
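The results JSON above is a flat mapping from task name to metric dict, so it can be post-processed with plain Python. A minimal sketch, using a trimmed in-memory stand-in with values copied from the card rather than the full downloaded file, that flattens per-task accuracies for quick comparison:

```python
# Trimmed stand-in for the results JSON shown above; the real file holds
# one entry per evaluated task, keyed "harness|<task>|<shots>".
results = {
    "all": {
        "acc": 0.6950888574372245,
        "acc_norm": 0.7088207463824532,
        "mc2": 0.5665232115821557,
    },
    "harness|winogrande|5": {"acc": 0.7782162588792423},
    "harness|gsm8k|5": {"acc": 0.002274450341167551},
}

# Flatten per-task accuracy, skipping the aggregated "all" entry and any
# task that reports other metrics (e.g. truthfulqa reports mc1/mc2).
per_task_acc = {
    task: metrics["acc"]
    for task, metrics in results.items()
    if task != "all" and "acc" in metrics
}
best_task = max(per_task_acc, key=per_task_acc.get)
print(best_task)  # harness|winogrande|5
```

The same pattern works on the full `results_*.json` once it has been fetched and parsed with `json.load`.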
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
booksouls/booksum-cleaned | ---
dataset_info:
features:
- name: chapter
dtype: string
- name: summary
dtype: string
splits:
- name: train
num_bytes: 161588585.20105472
num_examples: 8145
- name: validation
num_bytes: 24977290.21094265
num_examples: 1259
- name: test
num_bytes: 24104374.588002637
num_examples: 1215
download_size: 134806852
dataset_size: 210670250.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
taesiri/beninmadrid | ---
dataset_info:
features:
- name: image
dtype: image
splits:
- name: validation
num_bytes: 918214940.525
num_examples: 9105
download_size: 1407147767
dataset_size: 918214940.525
license: cc-by-4.0
task_categories:
- image-to-text
tags:
- art
pretty_name: Benin, Madrid
size_categories:
- 1K<n<10K
---
# Benin, Madrid Image Dataset
## Description
This dataset comprises images sourced from the [beninmadrid Instagram page](https://www.instagram.com/beninmadrid/) and is intended as a challenging benchmark for testing visual language models and large multimodal language models. The images in this dataset are characterized by their unique artistic style and complexity, which provides a robust test of the capabilities of modern AI models.
## Usage
This dataset is intended for research purposes, specifically the evaluation of visual and multimodal language models.
## Structure
- Each entry in the dataset is an image without any annotation or category.
## License
This dataset is made available under a [Creative Commons Attribution 4.0 International License](https://creativecommons.org/licenses/by/4.0/). |
vilm/Code-Pretrained-Instruction | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 118154566
num_examples: 78264
download_size: 51866219
dataset_size: 118154566
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
heliosprime/twitter_dataset_1713003857 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 6344
num_examples: 14
download_size: 8065
dataset_size: 6344
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713003857"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_pinkyponky__Mistral-7b-instruct-v0.2-summ-sft-e1 | ---
pretty_name: Evaluation run of pinkyponky/Mistral-7b-instruct-v0.2-summ-sft-e1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [pinkyponky/Mistral-7b-instruct-v0.2-summ-sft-e1](https://huggingface.co/pinkyponky/Mistral-7b-instruct-v0.2-summ-sft-e1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_pinkyponky__Mistral-7b-instruct-v0.2-summ-sft-e1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-21T05:33:56.046720](https://huggingface.co/datasets/open-llm-leaderboard/details_pinkyponky__Mistral-7b-instruct-v0.2-summ-sft-e1/blob/main/results_2024-01-21T05-33-56.046720.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5886798841868525,\n\
\ \"acc_stderr\": 0.033481177434210675,\n \"acc_norm\": 0.5934525981229489,\n\
\ \"acc_norm_stderr\": 0.034167981418467,\n \"mc1\": 0.4700122399020808,\n\
\ \"mc1_stderr\": 0.017471992091697537,\n \"mc2\": 0.6312877411374193,\n\
\ \"mc2_stderr\": 0.015524870393458118\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5435153583617748,\n \"acc_stderr\": 0.014555949760496444,\n\
\ \"acc_norm\": 0.6015358361774744,\n \"acc_norm_stderr\": 0.014306946052735567\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6310495917147978,\n\
\ \"acc_stderr\": 0.004815343349305216,\n \"acc_norm\": 0.8259310894244174,\n\
\ \"acc_norm_stderr\": 0.0037839381501516165\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5333333333333333,\n\
\ \"acc_stderr\": 0.043097329010363554,\n \"acc_norm\": 0.5333333333333333,\n\
\ \"acc_norm_stderr\": 0.043097329010363554\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5921052631578947,\n \"acc_stderr\": 0.039993097127774734,\n\
\ \"acc_norm\": 0.5921052631578947,\n \"acc_norm_stderr\": 0.039993097127774734\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n\
\ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6415094339622641,\n \"acc_stderr\": 0.029514703583981765,\n\
\ \"acc_norm\": 0.6415094339622641,\n \"acc_norm_stderr\": 0.029514703583981765\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6736111111111112,\n\
\ \"acc_stderr\": 0.03921067198982266,\n \"acc_norm\": 0.6736111111111112,\n\
\ \"acc_norm_stderr\": 0.03921067198982266\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n\
\ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5664739884393064,\n\
\ \"acc_stderr\": 0.037786210790920566,\n \"acc_norm\": 0.5664739884393064,\n\
\ \"acc_norm_stderr\": 0.037786210790920566\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.04784060704105654,\n\
\ \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.04784060704105654\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.502127659574468,\n \"acc_stderr\": 0.03268572658667492,\n\
\ \"acc_norm\": 0.502127659574468,\n \"acc_norm_stderr\": 0.03268572658667492\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.39473684210526316,\n\
\ \"acc_stderr\": 0.045981880578165414,\n \"acc_norm\": 0.39473684210526316,\n\
\ \"acc_norm_stderr\": 0.045981880578165414\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6068965517241379,\n \"acc_stderr\": 0.0407032901370707,\n\
\ \"acc_norm\": 0.6068965517241379,\n \"acc_norm_stderr\": 0.0407032901370707\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.36243386243386244,\n \"acc_stderr\": 0.02475747390275206,\n \"\
acc_norm\": 0.36243386243386244,\n \"acc_norm_stderr\": 0.02475747390275206\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n\
\ \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n\
\ \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6580645161290323,\n\
\ \"acc_stderr\": 0.026985289576552735,\n \"acc_norm\": 0.6580645161290323,\n\
\ \"acc_norm_stderr\": 0.026985289576552735\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5369458128078818,\n \"acc_stderr\": 0.035083705204426656,\n\
\ \"acc_norm\": 0.5369458128078818,\n \"acc_norm_stderr\": 0.035083705204426656\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\"\
: 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.036085410115739666,\n\
\ \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.036085410115739666\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7474747474747475,\n \"acc_stderr\": 0.030954055470365897,\n \"\
acc_norm\": 0.7474747474747475,\n \"acc_norm_stderr\": 0.030954055470365897\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.844559585492228,\n \"acc_stderr\": 0.026148483469153303,\n\
\ \"acc_norm\": 0.844559585492228,\n \"acc_norm_stderr\": 0.026148483469153303\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5512820512820513,\n \"acc_stderr\": 0.025217315184846482,\n\
\ \"acc_norm\": 0.5512820512820513,\n \"acc_norm_stderr\": 0.025217315184846482\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253255,\n \
\ \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253255\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5966386554621849,\n \"acc_stderr\": 0.031866081214088314,\n\
\ \"acc_norm\": 0.5966386554621849,\n \"acc_norm_stderr\": 0.031866081214088314\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"\
acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7779816513761468,\n \"acc_stderr\": 0.017818849564796648,\n \"\
acc_norm\": 0.7779816513761468,\n \"acc_norm_stderr\": 0.017818849564796648\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.47685185185185186,\n \"acc_stderr\": 0.03406315360711507,\n \"\
acc_norm\": 0.47685185185185186,\n \"acc_norm_stderr\": 0.03406315360711507\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.75,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.7510548523206751,\n \"acc_stderr\": 0.028146970599422644,\n\
\ \"acc_norm\": 0.7510548523206751,\n \"acc_norm_stderr\": 0.028146970599422644\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6322869955156951,\n\
\ \"acc_stderr\": 0.03236198350928275,\n \"acc_norm\": 0.6322869955156951,\n\
\ \"acc_norm_stderr\": 0.03236198350928275\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7099236641221374,\n \"acc_stderr\": 0.03980066246467766,\n\
\ \"acc_norm\": 0.7099236641221374,\n \"acc_norm_stderr\": 0.03980066246467766\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.71900826446281,\n \"acc_stderr\": 0.04103203830514512,\n \"acc_norm\"\
: 0.71900826446281,\n \"acc_norm_stderr\": 0.04103203830514512\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.04330043749650743,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.04330043749650743\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7239263803680982,\n \"acc_stderr\": 0.035123852837050475,\n\
\ \"acc_norm\": 0.7239263803680982,\n \"acc_norm_stderr\": 0.035123852837050475\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4107142857142857,\n\
\ \"acc_stderr\": 0.04669510663875191,\n \"acc_norm\": 0.4107142857142857,\n\
\ \"acc_norm_stderr\": 0.04669510663875191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6990291262135923,\n \"acc_stderr\": 0.04541609446503948,\n\
\ \"acc_norm\": 0.6990291262135923,\n \"acc_norm_stderr\": 0.04541609446503948\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8461538461538461,\n\
\ \"acc_stderr\": 0.023636873317489294,\n \"acc_norm\": 0.8461538461538461,\n\
\ \"acc_norm_stderr\": 0.023636873317489294\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.67,\n \"acc_stderr\": 0.047258156262526094,\n \
\ \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.047258156262526094\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7662835249042146,\n\
\ \"acc_stderr\": 0.01513338327898883,\n \"acc_norm\": 0.7662835249042146,\n\
\ \"acc_norm_stderr\": 0.01513338327898883\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.025416003773165545,\n\
\ \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.025416003773165545\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.30726256983240224,\n\
\ \"acc_stderr\": 0.015430158846469613,\n \"acc_norm\": 0.30726256983240224,\n\
\ \"acc_norm_stderr\": 0.015430158846469613\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.0267874531119065,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.0267874531119065\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6591639871382636,\n\
\ \"acc_stderr\": 0.026920841260776162,\n \"acc_norm\": 0.6591639871382636,\n\
\ \"acc_norm_stderr\": 0.026920841260776162\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.02622964917882116,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.02622964917882116\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4574468085106383,\n \"acc_stderr\": 0.029719281272236848,\n \
\ \"acc_norm\": 0.4574468085106383,\n \"acc_norm_stderr\": 0.029719281272236848\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4335071707953064,\n\
\ \"acc_stderr\": 0.012656810383983965,\n \"acc_norm\": 0.4335071707953064,\n\
\ \"acc_norm_stderr\": 0.012656810383983965\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5772058823529411,\n \"acc_stderr\": 0.030008562845003476,\n\
\ \"acc_norm\": 0.5772058823529411,\n \"acc_norm_stderr\": 0.030008562845003476\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6078431372549019,\n \"acc_stderr\": 0.019751726508762637,\n \
\ \"acc_norm\": 0.6078431372549019,\n \"acc_norm_stderr\": 0.019751726508762637\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.710204081632653,\n \"acc_stderr\": 0.02904308868330433,\n\
\ \"acc_norm\": 0.710204081632653,\n \"acc_norm_stderr\": 0.02904308868330433\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.746268656716418,\n\
\ \"acc_stderr\": 0.030769444967296018,\n \"acc_norm\": 0.746268656716418,\n\
\ \"acc_norm_stderr\": 0.030769444967296018\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036624,\n \
\ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036624\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.46987951807228917,\n\
\ \"acc_stderr\": 0.03885425420866766,\n \"acc_norm\": 0.46987951807228917,\n\
\ \"acc_norm_stderr\": 0.03885425420866766\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.0312678171466318,\n\
\ \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.0312678171466318\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4700122399020808,\n\
\ \"mc1_stderr\": 0.017471992091697537,\n \"mc2\": 0.6312877411374193,\n\
\ \"mc2_stderr\": 0.015524870393458118\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.771112865035517,\n \"acc_stderr\": 0.011807360224025391\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3775587566338135,\n \
\ \"acc_stderr\": 0.013353150666358539\n }\n}\n```"
repo_url: https://huggingface.co/pinkyponky/Mistral-7b-instruct-v0.2-summ-sft-e1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_21T05_27_51.994355
path:
- '**/details_harness|arc:challenge|25_2024-01-21T05-27-51.994355.parquet'
- split: 2024_01_21T05_33_56.046720
path:
- '**/details_harness|arc:challenge|25_2024-01-21T05-33-56.046720.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-21T05-33-56.046720.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_21T05_27_51.994355
path:
- '**/details_harness|gsm8k|5_2024-01-21T05-27-51.994355.parquet'
- split: 2024_01_21T05_33_56.046720
path:
- '**/details_harness|gsm8k|5_2024-01-21T05-33-56.046720.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-21T05-33-56.046720.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_21T05_27_51.994355
path:
- '**/details_harness|hellaswag|10_2024-01-21T05-27-51.994355.parquet'
- split: 2024_01_21T05_33_56.046720
path:
- '**/details_harness|hellaswag|10_2024-01-21T05-33-56.046720.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-21T05-33-56.046720.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_21T05_27_51.994355
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T05-27-51.994355.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T05-27-51.994355.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T05-27-51.994355.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T05-27-51.994355.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T05-27-51.994355.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T05-27-51.994355.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T05-27-51.994355.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T05-27-51.994355.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T05-27-51.994355.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T05-27-51.994355.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T05-27-51.994355.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T05-27-51.994355.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T05-27-51.994355.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T05-27-51.994355.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T05-27-51.994355.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T05-27-51.994355.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T05-27-51.994355.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T05-27-51.994355.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T05-27-51.994355.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T05-27-51.994355.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T05-27-51.994355.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T05-27-51.994355.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T05-27-51.994355.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T05-27-51.994355.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T05-27-51.994355.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T05-27-51.994355.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T05-27-51.994355.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T05-27-51.994355.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T05-27-51.994355.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T05-27-51.994355.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T05-27-51.994355.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T05-27-51.994355.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T05-27-51.994355.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T05-27-51.994355.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T05-27-51.994355.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T05-27-51.994355.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T05-27-51.994355.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T05-27-51.994355.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-21T05-27-51.994355.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T05-27-51.994355.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T05-27-51.994355.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T05-27-51.994355.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T05-27-51.994355.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T05-27-51.994355.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T05-27-51.994355.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T05-27-51.994355.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T05-27-51.994355.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T05-27-51.994355.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T05-27-51.994355.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T05-27-51.994355.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T05-27-51.994355.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T05-27-51.994355.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T05-27-51.994355.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T05-27-51.994355.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T05-27-51.994355.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T05-27-51.994355.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T05-27-51.994355.parquet'
- split: 2024_01_21T05_33_56.046720
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T05-33-56.046720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T05-33-56.046720.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T05-33-56.046720.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_21T05_27_51.994355
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T05-27-51.994355.parquet'
- split: 2024_01_21T05_33_56.046720
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T05-33-56.046720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T05-33-56.046720.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_21T05_27_51.994355
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T05-27-51.994355.parquet'
- split: 2024_01_21T05_33_56.046720
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T05-33-56.046720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T05-33-56.046720.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_21T05_27_51.994355
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T05-27-51.994355.parquet'
- split: 2024_01_21T05_33_56.046720
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T05-33-56.046720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T05-33-56.046720.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_21T05_27_51.994355
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T05-27-51.994355.parquet'
- split: 2024_01_21T05_33_56.046720
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T05-33-56.046720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T05-33-56.046720.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_21T05_27_51.994355
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T05-27-51.994355.parquet'
- split: 2024_01_21T05_33_56.046720
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T05-33-56.046720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T05-33-56.046720.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_21T05_27_51.994355
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T05-27-51.994355.parquet'
- split: 2024_01_21T05_33_56.046720
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T05-33-56.046720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T05-33-56.046720.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_21T05_27_51.994355
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T05-27-51.994355.parquet'
- split: 2024_01_21T05_33_56.046720
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T05-33-56.046720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T05-33-56.046720.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_21T05_27_51.994355
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T05-27-51.994355.parquet'
- split: 2024_01_21T05_33_56.046720
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T05-33-56.046720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T05-33-56.046720.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_21T05_27_51.994355
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T05-27-51.994355.parquet'
- split: 2024_01_21T05_33_56.046720
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T05-33-56.046720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T05-33-56.046720.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_21T05_27_51.994355
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T05-27-51.994355.parquet'
- split: 2024_01_21T05_33_56.046720
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T05-33-56.046720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T05-33-56.046720.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_21T05_27_51.994355
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T05-27-51.994355.parquet'
- split: 2024_01_21T05_33_56.046720
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T05-33-56.046720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T05-33-56.046720.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_21T05_27_51.994355
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T05-27-51.994355.parquet'
- split: 2024_01_21T05_33_56.046720
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T05-33-56.046720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T05-33-56.046720.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_21T05_27_51.994355
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T05-27-51.994355.parquet'
- split: 2024_01_21T05_33_56.046720
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T05-33-56.046720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T05-33-56.046720.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_21T05_27_51.994355
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T05-27-51.994355.parquet'
- split: 2024_01_21T05_33_56.046720
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T05-33-56.046720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T05-33-56.046720.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_21T05_27_51.994355
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T05-27-51.994355.parquet'
- split: 2024_01_21T05_33_56.046720
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T05-33-56.046720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T05-33-56.046720.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_21T05_27_51.994355
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T05-27-51.994355.parquet'
- split: 2024_01_21T05_33_56.046720
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T05-33-56.046720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T05-33-56.046720.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_21T05_27_51.994355
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T05-27-51.994355.parquet'
- split: 2024_01_21T05_33_56.046720
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T05-33-56.046720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T05-33-56.046720.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_21T05_27_51.994355
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T05-27-51.994355.parquet'
- split: 2024_01_21T05_33_56.046720
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T05-33-56.046720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T05-33-56.046720.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_21T05_27_51.994355
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T05-27-51.994355.parquet'
- split: 2024_01_21T05_33_56.046720
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T05-33-56.046720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T05-33-56.046720.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_21T05_27_51.994355
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T05-27-51.994355.parquet'
- split: 2024_01_21T05_33_56.046720
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T05-33-56.046720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T05-33-56.046720.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_21T05_27_51.994355
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T05-27-51.994355.parquet'
- split: 2024_01_21T05_33_56.046720
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T05-33-56.046720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T05-33-56.046720.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_21T05_27_51.994355
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T05-27-51.994355.parquet'
- split: 2024_01_21T05_33_56.046720
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T05-33-56.046720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T05-33-56.046720.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_21T05_27_51.994355
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T05-27-51.994355.parquet'
- split: 2024_01_21T05_33_56.046720
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T05-33-56.046720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T05-33-56.046720.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_21T05_27_51.994355
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T05-27-51.994355.parquet'
- split: 2024_01_21T05_33_56.046720
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T05-33-56.046720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T05-33-56.046720.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_21T05_27_51.994355
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T05-27-51.994355.parquet'
- split: 2024_01_21T05_33_56.046720
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T05-33-56.046720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T05-33-56.046720.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_21T05_27_51.994355
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T05-27-51.994355.parquet'
- split: 2024_01_21T05_33_56.046720
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T05-33-56.046720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T05-33-56.046720.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_21T05_27_51.994355
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T05-27-51.994355.parquet'
- split: 2024_01_21T05_33_56.046720
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T05-33-56.046720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T05-33-56.046720.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_21T05_27_51.994355
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T05-27-51.994355.parquet'
- split: 2024_01_21T05_33_56.046720
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T05-33-56.046720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T05-33-56.046720.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_21T05_27_51.994355
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T05-27-51.994355.parquet'
- split: 2024_01_21T05_33_56.046720
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T05-33-56.046720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T05-33-56.046720.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_21T05_27_51.994355
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T05-27-51.994355.parquet'
- split: 2024_01_21T05_33_56.046720
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T05-33-56.046720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T05-33-56.046720.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_21T05_27_51.994355
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T05-27-51.994355.parquet'
- split: 2024_01_21T05_33_56.046720
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T05-33-56.046720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T05-33-56.046720.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_21T05_27_51.994355
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T05-27-51.994355.parquet'
- split: 2024_01_21T05_33_56.046720
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T05-33-56.046720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T05-33-56.046720.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_21T05_27_51.994355
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T05-27-51.994355.parquet'
- split: 2024_01_21T05_33_56.046720
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T05-33-56.046720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T05-33-56.046720.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_21T05_27_51.994355
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T05-27-51.994355.parquet'
- split: 2024_01_21T05_33_56.046720
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T05-33-56.046720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T05-33-56.046720.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_21T05_27_51.994355
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T05-27-51.994355.parquet'
- split: 2024_01_21T05_33_56.046720
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T05-33-56.046720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T05-33-56.046720.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_21T05_27_51.994355
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T05-27-51.994355.parquet'
- split: 2024_01_21T05_33_56.046720
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T05-33-56.046720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T05-33-56.046720.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_21T05_27_51.994355
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T05-27-51.994355.parquet'
- split: 2024_01_21T05_33_56.046720
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T05-33-56.046720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T05-33-56.046720.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_21T05_27_51.994355
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T05-27-51.994355.parquet'
- split: 2024_01_21T05_33_56.046720
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T05-33-56.046720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T05-33-56.046720.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_21T05_27_51.994355
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-21T05-27-51.994355.parquet'
- split: 2024_01_21T05_33_56.046720
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-21T05-33-56.046720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-21T05-33-56.046720.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_21T05_27_51.994355
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T05-27-51.994355.parquet'
- split: 2024_01_21T05_33_56.046720
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T05-33-56.046720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T05-33-56.046720.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_21T05_27_51.994355
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T05-27-51.994355.parquet'
- split: 2024_01_21T05_33_56.046720
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T05-33-56.046720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T05-33-56.046720.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_21T05_27_51.994355
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T05-27-51.994355.parquet'
- split: 2024_01_21T05_33_56.046720
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T05-33-56.046720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T05-33-56.046720.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_21T05_27_51.994355
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T05-27-51.994355.parquet'
- split: 2024_01_21T05_33_56.046720
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T05-33-56.046720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T05-33-56.046720.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_21T05_27_51.994355
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T05-27-51.994355.parquet'
- split: 2024_01_21T05_33_56.046720
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T05-33-56.046720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T05-33-56.046720.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_21T05_27_51.994355
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T05-27-51.994355.parquet'
- split: 2024_01_21T05_33_56.046720
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T05-33-56.046720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T05-33-56.046720.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_21T05_27_51.994355
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T05-27-51.994355.parquet'
- split: 2024_01_21T05_33_56.046720
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T05-33-56.046720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T05-33-56.046720.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_21T05_27_51.994355
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T05-27-51.994355.parquet'
- split: 2024_01_21T05_33_56.046720
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T05-33-56.046720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T05-33-56.046720.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_21T05_27_51.994355
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T05-27-51.994355.parquet'
- split: 2024_01_21T05_33_56.046720
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T05-33-56.046720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T05-33-56.046720.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_21T05_27_51.994355
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T05-27-51.994355.parquet'
- split: 2024_01_21T05_33_56.046720
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T05-33-56.046720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T05-33-56.046720.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_21T05_27_51.994355
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T05-27-51.994355.parquet'
- split: 2024_01_21T05_33_56.046720
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T05-33-56.046720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T05-33-56.046720.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_21T05_27_51.994355
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T05-27-51.994355.parquet'
- split: 2024_01_21T05_33_56.046720
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T05-33-56.046720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T05-33-56.046720.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_21T05_27_51.994355
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T05-27-51.994355.parquet'
- split: 2024_01_21T05_33_56.046720
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T05-33-56.046720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T05-33-56.046720.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_21T05_27_51.994355
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T05-27-51.994355.parquet'
- split: 2024_01_21T05_33_56.046720
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T05-33-56.046720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T05-33-56.046720.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_21T05_27_51.994355
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T05-27-51.994355.parquet'
- split: 2024_01_21T05_33_56.046720
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T05-33-56.046720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T05-33-56.046720.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_21T05_27_51.994355
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T05-27-51.994355.parquet'
- split: 2024_01_21T05_33_56.046720
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T05-33-56.046720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T05-33-56.046720.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_21T05_27_51.994355
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T05-27-51.994355.parquet'
- split: 2024_01_21T05_33_56.046720
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T05-33-56.046720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T05-33-56.046720.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_21T05_27_51.994355
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T05-27-51.994355.parquet'
- split: 2024_01_21T05_33_56.046720
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T05-33-56.046720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T05-33-56.046720.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_21T05_27_51.994355
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-21T05-27-51.994355.parquet'
- split: 2024_01_21T05_33_56.046720
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-21T05-33-56.046720.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-21T05-33-56.046720.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_21T05_27_51.994355
path:
- '**/details_harness|winogrande|5_2024-01-21T05-27-51.994355.parquet'
- split: 2024_01_21T05_33_56.046720
path:
- '**/details_harness|winogrande|5_2024-01-21T05-33-56.046720.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-21T05-33-56.046720.parquet'
- config_name: results
data_files:
- split: 2024_01_21T05_27_51.994355
path:
- results_2024-01-21T05-27-51.994355.parquet
- split: 2024_01_21T05_33_56.046720
path:
- results_2024-01-21T05-33-56.046720.parquet
- split: latest
path:
- results_2024-01-21T05-33-56.046720.parquet
---
# Dataset Card for Evaluation run of pinkyponky/Mistral-7b-instruct-v0.2-summ-sft-e1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [pinkyponky/Mistral-7b-instruct-v0.2-summ-sft-e1](https://huggingface.co/pinkyponky/Mistral-7b-instruct-v0.2-summ-sft-e1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_pinkyponky__Mistral-7b-instruct-v0.2-summ-sft-e1",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-21T05:33:56.046720](https://huggingface.co/datasets/open-llm-leaderboard/details_pinkyponky__Mistral-7b-instruct-v0.2-summ-sft-e1/blob/main/results_2024-01-21T05-33-56.046720.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5886798841868525,
"acc_stderr": 0.033481177434210675,
"acc_norm": 0.5934525981229489,
"acc_norm_stderr": 0.034167981418467,
"mc1": 0.4700122399020808,
"mc1_stderr": 0.017471992091697537,
"mc2": 0.6312877411374193,
"mc2_stderr": 0.015524870393458118
},
"harness|arc:challenge|25": {
"acc": 0.5435153583617748,
"acc_stderr": 0.014555949760496444,
"acc_norm": 0.6015358361774744,
"acc_norm_stderr": 0.014306946052735567
},
"harness|hellaswag|10": {
"acc": 0.6310495917147978,
"acc_stderr": 0.004815343349305216,
"acc_norm": 0.8259310894244174,
"acc_norm_stderr": 0.0037839381501516165
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5333333333333333,
"acc_stderr": 0.043097329010363554,
"acc_norm": 0.5333333333333333,
"acc_norm_stderr": 0.043097329010363554
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5921052631578947,
"acc_stderr": 0.039993097127774734,
"acc_norm": 0.5921052631578947,
"acc_norm_stderr": 0.039993097127774734
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6415094339622641,
"acc_stderr": 0.029514703583981765,
"acc_norm": 0.6415094339622641,
"acc_norm_stderr": 0.029514703583981765
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6736111111111112,
"acc_stderr": 0.03921067198982266,
"acc_norm": 0.6736111111111112,
"acc_norm_stderr": 0.03921067198982266
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5664739884393064,
"acc_stderr": 0.037786210790920566,
"acc_norm": 0.5664739884393064,
"acc_norm_stderr": 0.037786210790920566
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.04784060704105654,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.04784060704105654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.502127659574468,
"acc_stderr": 0.03268572658667492,
"acc_norm": 0.502127659574468,
"acc_norm_stderr": 0.03268572658667492
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.39473684210526316,
"acc_stderr": 0.045981880578165414,
"acc_norm": 0.39473684210526316,
"acc_norm_stderr": 0.045981880578165414
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6068965517241379,
"acc_stderr": 0.0407032901370707,
"acc_norm": 0.6068965517241379,
"acc_norm_stderr": 0.0407032901370707
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.36243386243386244,
"acc_stderr": 0.02475747390275206,
"acc_norm": 0.36243386243386244,
"acc_norm_stderr": 0.02475747390275206
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6580645161290323,
"acc_stderr": 0.026985289576552735,
"acc_norm": 0.6580645161290323,
"acc_norm_stderr": 0.026985289576552735
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5369458128078818,
"acc_stderr": 0.035083705204426656,
"acc_norm": 0.5369458128078818,
"acc_norm_stderr": 0.035083705204426656
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.036085410115739666,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.036085410115739666
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7474747474747475,
"acc_stderr": 0.030954055470365897,
"acc_norm": 0.7474747474747475,
"acc_norm_stderr": 0.030954055470365897
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.844559585492228,
"acc_stderr": 0.026148483469153303,
"acc_norm": 0.844559585492228,
"acc_norm_stderr": 0.026148483469153303
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5512820512820513,
"acc_stderr": 0.025217315184846482,
"acc_norm": 0.5512820512820513,
"acc_norm_stderr": 0.025217315184846482
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.028820884666253255,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.028820884666253255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5966386554621849,
"acc_stderr": 0.031866081214088314,
"acc_norm": 0.5966386554621849,
"acc_norm_stderr": 0.031866081214088314
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7779816513761468,
"acc_stderr": 0.017818849564796648,
"acc_norm": 0.7779816513761468,
"acc_norm_stderr": 0.017818849564796648
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.47685185185185186,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.47685185185185186,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.75,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7510548523206751,
"acc_stderr": 0.028146970599422644,
"acc_norm": 0.7510548523206751,
"acc_norm_stderr": 0.028146970599422644
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6322869955156951,
"acc_stderr": 0.03236198350928275,
"acc_norm": 0.6322869955156951,
"acc_norm_stderr": 0.03236198350928275
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7099236641221374,
"acc_stderr": 0.03980066246467766,
"acc_norm": 0.7099236641221374,
"acc_norm_stderr": 0.03980066246467766
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.71900826446281,
"acc_stderr": 0.04103203830514512,
"acc_norm": 0.71900826446281,
"acc_norm_stderr": 0.04103203830514512
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.04330043749650743,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.04330043749650743
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7239263803680982,
"acc_stderr": 0.035123852837050475,
"acc_norm": 0.7239263803680982,
"acc_norm_stderr": 0.035123852837050475
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4107142857142857,
"acc_stderr": 0.04669510663875191,
"acc_norm": 0.4107142857142857,
"acc_norm_stderr": 0.04669510663875191
},
"harness|hendrycksTest-management|5": {
"acc": 0.6990291262135923,
"acc_stderr": 0.04541609446503948,
"acc_norm": 0.6990291262135923,
"acc_norm_stderr": 0.04541609446503948
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8461538461538461,
"acc_stderr": 0.023636873317489294,
"acc_norm": 0.8461538461538461,
"acc_norm_stderr": 0.023636873317489294
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526094,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526094
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7662835249042146,
"acc_stderr": 0.01513338327898883,
"acc_norm": 0.7662835249042146,
"acc_norm_stderr": 0.01513338327898883
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.025416003773165545,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.025416003773165545
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.30726256983240224,
"acc_stderr": 0.015430158846469613,
"acc_norm": 0.30726256983240224,
"acc_norm_stderr": 0.015430158846469613
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.0267874531119065,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.0267874531119065
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6591639871382636,
"acc_stderr": 0.026920841260776162,
"acc_norm": 0.6591639871382636,
"acc_norm_stderr": 0.026920841260776162
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.02622964917882116,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.02622964917882116
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4574468085106383,
"acc_stderr": 0.029719281272236848,
"acc_norm": 0.4574468085106383,
"acc_norm_stderr": 0.029719281272236848
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4335071707953064,
"acc_stderr": 0.012656810383983965,
"acc_norm": 0.4335071707953064,
"acc_norm_stderr": 0.012656810383983965
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5772058823529411,
"acc_stderr": 0.030008562845003476,
"acc_norm": 0.5772058823529411,
"acc_norm_stderr": 0.030008562845003476
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6078431372549019,
"acc_stderr": 0.019751726508762637,
"acc_norm": 0.6078431372549019,
"acc_norm_stderr": 0.019751726508762637
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.710204081632653,
"acc_stderr": 0.02904308868330433,
"acc_norm": 0.710204081632653,
"acc_norm_stderr": 0.02904308868330433
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.746268656716418,
"acc_stderr": 0.030769444967296018,
"acc_norm": 0.746268656716418,
"acc_norm_stderr": 0.030769444967296018
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036624,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036624
},
"harness|hendrycksTest-virology|5": {
"acc": 0.46987951807228917,
"acc_stderr": 0.03885425420866766,
"acc_norm": 0.46987951807228917,
"acc_norm_stderr": 0.03885425420866766
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7894736842105263,
"acc_stderr": 0.0312678171466318,
"acc_norm": 0.7894736842105263,
"acc_norm_stderr": 0.0312678171466318
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4700122399020808,
"mc1_stderr": 0.017471992091697537,
"mc2": 0.6312877411374193,
"mc2_stderr": 0.015524870393458118
},
"harness|winogrande|5": {
"acc": 0.771112865035517,
"acc_stderr": 0.011807360224025391
},
"harness|gsm8k|5": {
"acc": 0.3775587566338135,
"acc_stderr": 0.013353150666358539
}
}
```
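The per-task breakdown above is plain JSON, so it can be scanned programmatically once loaded; a minimal sketch over a hand-copied excerpt (task names and accuracies taken verbatim from the results block above — only a few tasks are included here for illustration):

```python
# Excerpt of the per-task "acc" values from the results block above.
scores = {
    "hendrycksTest-marketing": 0.8461538461538461,
    "hendrycksTest-moral_scenarios": 0.30726256983240224,
    "hendrycksTest-us_foreign_policy": 0.81,
    "hendrycksTest-virology": 0.46987951807228917,
    "hendrycksTest-world_religions": 0.7894736842105263,
}

# Find the strongest and weakest subtasks in the excerpt.
best = max(scores, key=scores.get)
worst = min(scores, key=scores.get)
print(best)   # → hendrycksTest-marketing
print(worst)  # → hendrycksTest-moral_scenarios
```

The same pattern applies to the full dict once it is loaded with `json.load` or via the "results" configuration of the dataset.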
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
SHS/cancer_test_data3 | ---
dataset_info:
features:
- name: passage
dtype: string
- name: passage_token
sequence: string
splits:
- name: train
num_bytes: 16571
num_examples: 1
download_size: 10583
dataset_size: 16571
---
# Dataset Card for "cancer_test_data3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
HeonWoo22/my_dataset | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 1419447.0
num_examples: 63
download_size: 1418242
dataset_size: 1419447.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ray-w233/mini-platypus | ---
license: mit
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 4289117
num_examples: 1000
download_size: 2281186
dataset_size: 4289117
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ayan1988/diffusion.maobi | ---
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 162901211.0
num_examples: 319
download_size: 162850654
dataset_size: 162901211.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/silva_granbluefantasy | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of silva (Granblue Fantasy)
This is the dataset of silva (Granblue Fantasy), containing 341 images and their tags.
The core tags of this character are `long_hair, breasts, yellow_eyes, braid, ahoge, large_breasts, twin_braids, hair_between_eyes, very_long_hair, bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 341 | 384.63 MiB | [Download](https://huggingface.co/datasets/CyberHarem/silva_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 341 | 263.67 MiB | [Download](https://huggingface.co/datasets/CyberHarem/silva_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 754 | 504.47 MiB | [Download](https://huggingface.co/datasets/CyberHarem/silva_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 341 | 359.38 MiB | [Download](https://huggingface.co/datasets/CyberHarem/silva_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 754 | 642.72 MiB | [Download](https://huggingface.co/datasets/CyberHarem/silva_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/silva_granbluefantasy',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 9 |  |  |  |  |  | blush, pantyhose, playboy_bunny, rabbit_ears, 1girl, leotard, solo, fake_animal_ears, looking_at_viewer, bare_shoulders, detached_collar, white_background, ass, cleavage, open_mouth, rabbit_tail, simple_background, wrist_cuffs |
| 1 | 5 |  |  |  |  |  | 1girl, bare_shoulders, belt, blush, cleavage, collarbone, crop_top, looking_at_viewer, midriff, solo, navel, simple_background, wavy_hair, armpits, smile, white_background, arm_behind_head, arms_up, black_pants, brown_eyes, closed_mouth, open_clothes, ponytail, sleeveless_shirt, upper_body |
| 2 | 15 |  |  |  |  |  | 1girl, cleavage, midriff, solo, belt, navel, looking_at_viewer, crop_top, black_jacket, long_sleeves, open_jacket, collarbone, white_shirt, ponytail, white_background, simple_background, wavy_hair, black_pants, gun, stomach |
| 3 | 47 |  |  |  |  |  | 1girl, midriff, solo, belt, cleavage, miniskirt, navel, rifle, looking_at_viewer, boots, crop_top, thighhighs, holding |
| 4 | 17 |  |  |  |  |  | cleavage, 1girl, collarbone, looking_at_viewer, solo, bare_shoulders, blue_bikini, navel, blush, sarong, smile, sun_hat, sitting |
| 5 | 5 |  |  |  |  |  | blue_bikini, blue_sky, blush, cleavage, day, looking_at_viewer, navel, smile, 1girl, bare_shoulders, beach, collarbone, outdoors, solo, thighs, cloud, side-tie_bikini_bottom, water, grey_hair, micro_bikini, ocean, open_mouth, palm_tree, sitting, standing, wavy_hair |
| 6 | 22 |  |  |  |  |  | 1girl, official_alternate_costume, bare_shoulders, looking_at_viewer, ponytail, solo, choker, cleavage, collarbone, blue_dress, blush, smile, white_background, simple_background, bracelet, wavy_hair, purple_dress, closed_mouth, thighs |
| 7 | 6 |  |  |  |  |  | 1boy, 1girl, blush, hetero, paizuri, solo_focus, nipples, penis, huge_breasts, collarbone, cum, mosaic_censoring, open_mouth, smile |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | blush | pantyhose | playboy_bunny | rabbit_ears | 1girl | leotard | solo | fake_animal_ears | looking_at_viewer | bare_shoulders | detached_collar | white_background | ass | cleavage | open_mouth | rabbit_tail | simple_background | wrist_cuffs | belt | collarbone | crop_top | midriff | navel | wavy_hair | armpits | smile | arm_behind_head | arms_up | black_pants | brown_eyes | closed_mouth | open_clothes | ponytail | sleeveless_shirt | upper_body | black_jacket | long_sleeves | open_jacket | white_shirt | gun | stomach | miniskirt | rifle | boots | thighhighs | holding | blue_bikini | sarong | sun_hat | sitting | blue_sky | day | beach | outdoors | thighs | cloud | side-tie_bikini_bottom | water | grey_hair | micro_bikini | ocean | palm_tree | standing | official_alternate_costume | choker | blue_dress | bracelet | purple_dress | 1boy | hetero | paizuri | solo_focus | nipples | penis | huge_breasts | cum | mosaic_censoring |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:------------|:----------------|:--------------|:--------|:----------|:-------|:-------------------|:--------------------|:-----------------|:------------------|:-------------------|:------|:-----------|:-------------|:--------------|:--------------------|:--------------|:-------|:-------------|:-----------|:----------|:--------|:------------|:----------|:--------|:------------------|:----------|:--------------|:-------------|:---------------|:---------------|:-----------|:-------------------|:-------------|:---------------|:---------------|:--------------|:--------------|:------|:----------|:------------|:--------|:--------|:-------------|:----------|:--------------|:---------|:----------|:----------|:-----------|:------|:--------|:-----------|:---------|:--------|:-------------------------|:--------|:------------|:---------------|:--------|:------------|:-----------|:-----------------------------|:---------|:-------------|:-----------|:---------------|:-------|:---------|:----------|:-------------|:----------|:--------|:---------------|:------|:-------------------|
| 0 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | | | | X | | X | | X | X | | X | | X | | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 15 |  |  |  |  |  | | | | | X | | X | | X | | | X | | X | | | X | | X | X | X | X | X | X | | | | | X | | | | X | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 47 |  |  |  |  |  | | | | | X | | X | | X | | | | | X | | | | | X | | X | X | X | | | | | | | | | | | | | | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 17 |  |  |  |  |  | X | | | | X | | X | | X | X | | | | X | | | | | | X | | | X | | | X | | | | | | | | | | | | | | | | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | X | | | | X | | X | | X | X | | | | X | X | | | | | X | | | X | X | | X | | | | | | | | | | | | | | | | | | | | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | |
| 6 | 22 |  |  |  |  |  | X | | | | X | | X | | X | X | | X | | X | | | X | | | X | | | | X | | X | | | | | X | | X | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | X | X | X | X | X | | | | | | | | | |
| 7 | 6 |  |  |  |  |  | X | | | | X | | | | | | | | | | X | | | | | X | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X |
|
TrainThenObtain-ai/jarvis | ---
license: openrail
---
|
arjun2183/train3k | ---
dataset_info:
features:
- name: Context
dtype: string
- name: Response
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 3990979
num_examples: 3000
download_size: 2050056
dataset_size: 3990979
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
mrm8488/en_es_sample_bad | ---
dataset_info:
features:
- name: source
dtype: string
- name: target
dtype: string
splits:
- name: train
num_bytes: 1318
num_examples: 20
download_size: 2547
dataset_size: 1318
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
zxx-silence/shiba-inu-fenda-dreambooth | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 2920788.0
num_examples: 10
download_size: 2921904
dataset_size: 2920788.0
---
# Dataset Card for "shiba-inu-fenda-dreambooth"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
bdsaglam/webnlg-musique-jerx-sft | ---
dataset_info:
features:
- name: text
dtype: string
- name: triplets
sequence: string
- name: source
dtype: string
splits:
- name: test
num_bytes: 2252402
num_examples: 7305
- name: dev
num_bytes: 1225852
num_examples: 4464
- name: train
num_bytes: 9782759
num_examples: 35536
download_size: 2664822
dataset_size: 13261013
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
- split: dev
path: data/dev-*
- split: train
path: data/train-*
---
|
datahrvoje/twitter_dataset_1713045827 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 22403
num_examples: 50
download_size: 12030
dataset_size: 22403
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
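The `dataset_info` block above doubles as a lightweight schema; a hedged stdlib sketch that checks a candidate record against it (field names and dtypes are taken from the YAML, while the sample record itself is invented for illustration):

```python
# Field -> Python type, derived from the card's dataset_info block.
SCHEMA = {
    "id": str, "tweet_content": str, "user_name": str, "user_id": str,
    "created_at": str, "url": str, "favourite_count": int,
    "scraped_at": str, "image_urls": str,
}

def validate(record: dict) -> bool:
    """True iff the record has exactly the card's fields with matching types."""
    return set(record) == set(SCHEMA) and all(
        isinstance(record[k], t) for k, t in SCHEMA.items()
    )

# A placeholder record with the right shape (empty strings, zero count).
sample = {k: ("" if t is str else 0) for k, t in SCHEMA.items()}
print(validate(sample))  # → True
print(validate({"id": "1"}))  # → False (missing fields)
```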
|