id stringlengths 2 115 | lastModified stringlengths 24 24 | tags list | author stringlengths 2 42 ⌀ | description stringlengths 0 68.7k ⌀ | citation stringlengths 0 10.7k ⌀ | cardData null | likes int64 0 3.55k | downloads int64 0 10.1M | card stringlengths 0 1.01M |
|---|---|---|---|---|---|---|---|---|---|
open-llm-leaderboard/details_sequelbox__SharpBalance | 2023-10-09T05:51:10.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of sequelbox/SharpBalance
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [sequelbox/SharpBalance](https://huggingface.co/sequelbox/SharpBalance) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_sequelbox__SharpBalance\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-09T05:49:47.525988](https://huggingface.co/datasets/open-llm-leaderboard/details_sequelbox__SharpBalance/blob/main/results_2023-10-09T05-49-47.525988.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks; you can find the results in the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6942363573453332,\n\
\ \"acc_stderr\": 0.031096725292772836,\n \"acc_norm\": 0.6981043628934316,\n\
\ \"acc_norm_stderr\": 0.03106678602765059,\n \"mc1\": 0.4259485924112607,\n\
\ \"mc1_stderr\": 0.01731047190407654,\n \"mc2\": 0.5904880959366052,\n\
\ \"mc2_stderr\": 0.014903036806895207\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6527303754266212,\n \"acc_stderr\": 0.013913034529620446,\n\
\ \"acc_norm\": 0.6928327645051194,\n \"acc_norm_stderr\": 0.013481034054980941\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6878111929894444,\n\
\ \"acc_stderr\": 0.004624393690966902,\n \"acc_norm\": 0.8759211312487553,\n\
\ \"acc_norm_stderr\": 0.0032899775233939097\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\
\ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n\
\ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8092105263157895,\n \"acc_stderr\": 0.031975658210325,\n\
\ \"acc_norm\": 0.8092105263157895,\n \"acc_norm_stderr\": 0.031975658210325\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.720754716981132,\n \"acc_stderr\": 0.027611163402399715,\n\
\ \"acc_norm\": 0.720754716981132,\n \"acc_norm_stderr\": 0.027611163402399715\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.03309615177059006,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.03309615177059006\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n\
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.035995863012470763,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.035995863012470763\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266345,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266345\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6723404255319149,\n \"acc_stderr\": 0.030683020843231004,\n\
\ \"acc_norm\": 0.6723404255319149,\n \"acc_norm_stderr\": 0.030683020843231004\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.41228070175438597,\n\
\ \"acc_stderr\": 0.04630653203366595,\n \"acc_norm\": 0.41228070175438597,\n\
\ \"acc_norm_stderr\": 0.04630653203366595\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.040824829046386284,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.040824829046386284\n \
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.43915343915343913,\n \"acc_stderr\": 0.02555992055053101,\n \"\
acc_norm\": 0.43915343915343913,\n \"acc_norm_stderr\": 0.02555992055053101\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5238095238095238,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.5238095238095238,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.8161290322580645,\n \"acc_stderr\": 0.022037217340267826,\n \"\
acc_norm\": 0.8161290322580645,\n \"acc_norm_stderr\": 0.022037217340267826\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5369458128078818,\n \"acc_stderr\": 0.035083705204426656,\n \"\
acc_norm\": 0.5369458128078818,\n \"acc_norm_stderr\": 0.035083705204426656\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8303030303030303,\n \"acc_stderr\": 0.029311188674983134,\n\
\ \"acc_norm\": 0.8303030303030303,\n \"acc_norm_stderr\": 0.029311188674983134\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8838383838383839,\n \"acc_stderr\": 0.02282888177524938,\n \"\
acc_norm\": 0.8838383838383839,\n \"acc_norm_stderr\": 0.02282888177524938\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.927461139896373,\n \"acc_stderr\": 0.018718998520678178,\n\
\ \"acc_norm\": 0.927461139896373,\n \"acc_norm_stderr\": 0.018718998520678178\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7076923076923077,\n \"acc_stderr\": 0.02306043838085774,\n \
\ \"acc_norm\": 0.7076923076923077,\n \"acc_norm_stderr\": 0.02306043838085774\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2962962962962963,\n \"acc_stderr\": 0.027840811495871934,\n \
\ \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.027840811495871934\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7521008403361344,\n \"acc_stderr\": 0.028047967224176892,\n\
\ \"acc_norm\": 0.7521008403361344,\n \"acc_norm_stderr\": 0.028047967224176892\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4768211920529801,\n \"acc_stderr\": 0.04078093859163083,\n \"\
acc_norm\": 0.4768211920529801,\n \"acc_norm_stderr\": 0.04078093859163083\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8880733944954129,\n \"acc_stderr\": 0.013517352714958792,\n \"\
acc_norm\": 0.8880733944954129,\n \"acc_norm_stderr\": 0.013517352714958792\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5462962962962963,\n \"acc_stderr\": 0.033953227263757976,\n \"\
acc_norm\": 0.5462962962962963,\n \"acc_norm_stderr\": 0.033953227263757976\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9166666666666666,\n \"acc_stderr\": 0.019398452135813905,\n \"\
acc_norm\": 0.9166666666666666,\n \"acc_norm_stderr\": 0.019398452135813905\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8776371308016878,\n \"acc_stderr\": 0.021331741829746786,\n \
\ \"acc_norm\": 0.8776371308016878,\n \"acc_norm_stderr\": 0.021331741829746786\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7982062780269058,\n\
\ \"acc_stderr\": 0.02693611191280227,\n \"acc_norm\": 0.7982062780269058,\n\
\ \"acc_norm_stderr\": 0.02693611191280227\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8473282442748091,\n \"acc_stderr\": 0.031545216720054725,\n\
\ \"acc_norm\": 0.8473282442748091,\n \"acc_norm_stderr\": 0.031545216720054725\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8760330578512396,\n \"acc_stderr\": 0.03008309871603521,\n \"\
acc_norm\": 0.8760330578512396,\n \"acc_norm_stderr\": 0.03008309871603521\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n\
\ \"acc_stderr\": 0.03755265865037181,\n \"acc_norm\": 0.8148148148148148,\n\
\ \"acc_norm_stderr\": 0.03755265865037181\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8098159509202454,\n \"acc_stderr\": 0.030833491146281235,\n\
\ \"acc_norm\": 0.8098159509202454,\n \"acc_norm_stderr\": 0.030833491146281235\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.0376017800602662,\n\
\ \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.0376017800602662\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9188034188034188,\n\
\ \"acc_stderr\": 0.017893784904018533,\n \"acc_norm\": 0.9188034188034188,\n\
\ \"acc_norm_stderr\": 0.017893784904018533\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8735632183908046,\n\
\ \"acc_stderr\": 0.01188448890589555,\n \"acc_norm\": 0.8735632183908046,\n\
\ \"acc_norm_stderr\": 0.01188448890589555\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7976878612716763,\n \"acc_stderr\": 0.02162807738019612,\n\
\ \"acc_norm\": 0.7976878612716763,\n \"acc_norm_stderr\": 0.02162807738019612\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.529608938547486,\n\
\ \"acc_stderr\": 0.01669315492738355,\n \"acc_norm\": 0.529608938547486,\n\
\ \"acc_norm_stderr\": 0.01669315492738355\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.02526169121972949,\n\
\ \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.02526169121972949\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7909967845659164,\n\
\ \"acc_stderr\": 0.02309314039837422,\n \"acc_norm\": 0.7909967845659164,\n\
\ \"acc_norm_stderr\": 0.02309314039837422\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8148148148148148,\n \"acc_stderr\": 0.0216138093952248,\n\
\ \"acc_norm\": 0.8148148148148148,\n \"acc_norm_stderr\": 0.0216138093952248\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5425531914893617,\n \"acc_stderr\": 0.029719281272236834,\n \
\ \"acc_norm\": 0.5425531914893617,\n \"acc_norm_stderr\": 0.029719281272236834\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.546284224250326,\n\
\ \"acc_stderr\": 0.012715404841277752,\n \"acc_norm\": 0.546284224250326,\n\
\ \"acc_norm_stderr\": 0.012715404841277752\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7169117647058824,\n \"acc_stderr\": 0.02736586113151381,\n\
\ \"acc_norm\": 0.7169117647058824,\n \"acc_norm_stderr\": 0.02736586113151381\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.75,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\"\
: 0.75,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\"\
: {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.04265792110940589,\n\
\ \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04265792110940589\n\
\ },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8,\n\
\ \"acc_stderr\": 0.02560737598657916,\n \"acc_norm\": 0.8,\n \
\ \"acc_norm_stderr\": 0.02560737598657916\n },\n \"harness|hendrycksTest-sociology|5\"\
: {\n \"acc\": 0.8756218905472637,\n \"acc_stderr\": 0.023335401790166327,\n\
\ \"acc_norm\": 0.8756218905472637,\n \"acc_norm_stderr\": 0.023335401790166327\n\
\ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\
\ 0.92,\n \"acc_stderr\": 0.0272659924344291,\n \"acc_norm\": 0.92,\n\
\ \"acc_norm_stderr\": 0.0272659924344291\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.5481927710843374,\n \"acc_stderr\": 0.03874371556587953,\n\
\ \"acc_norm\": 0.5481927710843374,\n \"acc_norm_stderr\": 0.03874371556587953\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8596491228070176,\n\
\ \"acc_stderr\": 0.0266405825391332,\n \"acc_norm\": 0.8596491228070176,\n\
\ \"acc_norm_stderr\": 0.0266405825391332\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 0.4259485924112607,\n \"mc1_stderr\": 0.01731047190407654,\n\
\ \"mc2\": 0.5904880959366052,\n \"mc2_stderr\": 0.014903036806895207\n\
\ }\n}\n```"
repo_url: https://huggingface.co/sequelbox/SharpBalance
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_09T05_49_47.525988
path:
- '**/details_harness|arc:challenge|25_2023-10-09T05-49-47.525988.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-09T05-49-47.525988.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_09T05_49_47.525988
path:
- '**/details_harness|hellaswag|10_2023-10-09T05-49-47.525988.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-09T05-49-47.525988.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_09T05_49_47.525988
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T05-49-47.525988.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T05-49-47.525988.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T05-49-47.525988.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_09T05_49_47.525988
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T05-49-47.525988.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T05-49-47.525988.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_09T05_49_47.525988
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T05-49-47.525988.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T05-49-47.525988.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_09T05_49_47.525988
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T05-49-47.525988.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T05-49-47.525988.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_09T05_49_47.525988
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T05-49-47.525988.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T05-49-47.525988.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_09T05_49_47.525988
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T05-49-47.525988.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T05-49-47.525988.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_09T05_49_47.525988
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T05-49-47.525988.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T05-49-47.525988.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_09T05_49_47.525988
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T05-49-47.525988.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T05-49-47.525988.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_09T05_49_47.525988
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T05-49-47.525988.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T05-49-47.525988.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_09T05_49_47.525988
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T05-49-47.525988.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T05-49-47.525988.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_09T05_49_47.525988
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T05-49-47.525988.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T05-49-47.525988.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_09T05_49_47.525988
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T05-49-47.525988.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T05-49-47.525988.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_09T05_49_47.525988
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T05-49-47.525988.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T05-49-47.525988.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_09T05_49_47.525988
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T05-49-47.525988.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T05-49-47.525988.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_09T05_49_47.525988
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T05-49-47.525988.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T05-49-47.525988.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_09T05_49_47.525988
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T05-49-47.525988.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T05-49-47.525988.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_09T05_49_47.525988
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T05-49-47.525988.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T05-49-47.525988.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_09T05_49_47.525988
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T05-49-47.525988.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T05-49-47.525988.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_09T05_49_47.525988
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T05-49-47.525988.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T05-49-47.525988.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_09T05_49_47.525988
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T05-49-47.525988.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T05-49-47.525988.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_09T05_49_47.525988
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T05-49-47.525988.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T05-49-47.525988.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_09T05_49_47.525988
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T05-49-47.525988.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T05-49-47.525988.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_09T05_49_47.525988
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T05-49-47.525988.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T05-49-47.525988.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_09T05_49_47.525988
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T05-49-47.525988.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T05-49-47.525988.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_09T05_49_47.525988
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T05-49-47.525988.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T05-49-47.525988.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_09T05_49_47.525988
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T05-49-47.525988.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T05-49-47.525988.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_09T05_49_47.525988
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T05-49-47.525988.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T05-49-47.525988.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_09T05_49_47.525988
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T05-49-47.525988.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T05-49-47.525988.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_09T05_49_47.525988
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T05-49-47.525988.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T05-49-47.525988.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_09T05_49_47.525988
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T05-49-47.525988.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T05-49-47.525988.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_09T05_49_47.525988
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T05-49-47.525988.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T05-49-47.525988.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_09T05_49_47.525988
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T05-49-47.525988.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T05-49-47.525988.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_09T05_49_47.525988
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T05-49-47.525988.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T05-49-47.525988.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_09T05_49_47.525988
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T05-49-47.525988.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T05-49-47.525988.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_09T05_49_47.525988
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T05-49-47.525988.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T05-49-47.525988.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_09T05_49_47.525988
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T05-49-47.525988.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T05-49-47.525988.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_09T05_49_47.525988
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T05-49-47.525988.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T05-49-47.525988.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_09T05_49_47.525988
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T05-49-47.525988.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T05-49-47.525988.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_09T05_49_47.525988
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T05-49-47.525988.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T05-49-47.525988.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_09T05_49_47.525988
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T05-49-47.525988.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T05-49-47.525988.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_09T05_49_47.525988
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T05-49-47.525988.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T05-49-47.525988.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_09T05_49_47.525988
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T05-49-47.525988.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T05-49-47.525988.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_09T05_49_47.525988
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T05-49-47.525988.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T05-49-47.525988.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_09T05_49_47.525988
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T05-49-47.525988.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T05-49-47.525988.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_09T05_49_47.525988
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T05-49-47.525988.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T05-49-47.525988.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_09T05_49_47.525988
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T05-49-47.525988.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T05-49-47.525988.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_09T05_49_47.525988
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T05-49-47.525988.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T05-49-47.525988.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_09T05_49_47.525988
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T05-49-47.525988.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T05-49-47.525988.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_09T05_49_47.525988
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T05-49-47.525988.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T05-49-47.525988.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_09T05_49_47.525988
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T05-49-47.525988.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T05-49-47.525988.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_09T05_49_47.525988
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T05-49-47.525988.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T05-49-47.525988.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_09T05_49_47.525988
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T05-49-47.525988.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T05-49-47.525988.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_09T05_49_47.525988
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T05-49-47.525988.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T05-49-47.525988.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_09T05_49_47.525988
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T05-49-47.525988.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T05-49-47.525988.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_09T05_49_47.525988
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T05-49-47.525988.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T05-49-47.525988.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_09T05_49_47.525988
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T05-49-47.525988.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T05-49-47.525988.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_09T05_49_47.525988
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T05-49-47.525988.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T05-49-47.525988.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_09T05_49_47.525988
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T05-49-47.525988.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T05-49-47.525988.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_09T05_49_47.525988
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T05-49-47.525988.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T05-49-47.525988.parquet'
- config_name: results
data_files:
- split: 2023_10_09T05_49_47.525988
path:
- results_2023-10-09T05-49-47.525988.parquet
- split: latest
path:
- results_2023-10-09T05-49-47.525988.parquet
---
# Dataset Card for Evaluation run of sequelbox/SharpBalance
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/sequelbox/SharpBalance
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [sequelbox/SharpBalance](https://huggingface.co/sequelbox/SharpBalance) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_sequelbox__SharpBalance",
"harness_truthfulqa_mc_0",
split="train")
```
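You can also load the aggregated metrics, or pin a specific timestamped run split. A minimal sketch, assuming the same `load_dataset` API and the config/split names declared in this repo:
```python
from datasets import load_dataset

# Aggregated metrics for the whole run: the "results" config, with the
# "latest" split alias pointing at the most recent run.
results = load_dataset(
    "open-llm-leaderboard/details_sequelbox__SharpBalance",
    "results",
    split="latest",
)

# A single MMLU subtask, pinned to the timestamped split of this run.
anatomy = load_dataset(
    "open-llm-leaderboard/details_sequelbox__SharpBalance",
    "harness_hendrycksTest_anatomy_5",
    split="2023_10_09T05_49_47.525988",
)
```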
## Latest results
These are the [latest results from run 2023-10-09T05:49:47.525988](https://huggingface.co/datasets/open-llm-leaderboard/details_sequelbox__SharpBalance/blob/main/results_2023-10-09T05-49-47.525988.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find the results in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6942363573453332,
"acc_stderr": 0.031096725292772836,
"acc_norm": 0.6981043628934316,
"acc_norm_stderr": 0.03106678602765059,
"mc1": 0.4259485924112607,
"mc1_stderr": 0.01731047190407654,
"mc2": 0.5904880959366052,
"mc2_stderr": 0.014903036806895207
},
"harness|arc:challenge|25": {
"acc": 0.6527303754266212,
"acc_stderr": 0.013913034529620446,
"acc_norm": 0.6928327645051194,
"acc_norm_stderr": 0.013481034054980941
},
"harness|hellaswag|10": {
"acc": 0.6878111929894444,
"acc_stderr": 0.004624393690966902,
"acc_norm": 0.8759211312487553,
"acc_norm_stderr": 0.0032899775233939097
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8092105263157895,
"acc_stderr": 0.031975658210325,
"acc_norm": 0.8092105263157895,
"acc_norm_stderr": 0.031975658210325
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.720754716981132,
"acc_stderr": 0.027611163402399715,
"acc_norm": 0.720754716981132,
"acc_norm_stderr": 0.027611163402399715
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.03309615177059006,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.03309615177059006
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.035995863012470763,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.035995863012470763
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266345,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266345
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6723404255319149,
"acc_stderr": 0.030683020843231004,
"acc_norm": 0.6723404255319149,
"acc_norm_stderr": 0.030683020843231004
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.41228070175438597,
"acc_stderr": 0.04630653203366595,
"acc_norm": 0.41228070175438597,
"acc_norm_stderr": 0.04630653203366595
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6,
"acc_stderr": 0.040824829046386284,
"acc_norm": 0.6,
"acc_norm_stderr": 0.040824829046386284
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.43915343915343913,
"acc_stderr": 0.02555992055053101,
"acc_norm": 0.43915343915343913,
"acc_norm_stderr": 0.02555992055053101
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5238095238095238,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.5238095238095238,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8161290322580645,
"acc_stderr": 0.022037217340267826,
"acc_norm": 0.8161290322580645,
"acc_norm_stderr": 0.022037217340267826
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5369458128078818,
"acc_stderr": 0.035083705204426656,
"acc_norm": 0.5369458128078818,
"acc_norm_stderr": 0.035083705204426656
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8303030303030303,
"acc_stderr": 0.029311188674983134,
"acc_norm": 0.8303030303030303,
"acc_norm_stderr": 0.029311188674983134
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8838383838383839,
"acc_stderr": 0.02282888177524938,
"acc_norm": 0.8838383838383839,
"acc_norm_stderr": 0.02282888177524938
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.927461139896373,
"acc_stderr": 0.018718998520678178,
"acc_norm": 0.927461139896373,
"acc_norm_stderr": 0.018718998520678178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7076923076923077,
"acc_stderr": 0.02306043838085774,
"acc_norm": 0.7076923076923077,
"acc_norm_stderr": 0.02306043838085774
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.027840811495871934,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.027840811495871934
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7521008403361344,
"acc_stderr": 0.028047967224176892,
"acc_norm": 0.7521008403361344,
"acc_norm_stderr": 0.028047967224176892
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4768211920529801,
"acc_stderr": 0.04078093859163083,
"acc_norm": 0.4768211920529801,
"acc_norm_stderr": 0.04078093859163083
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8880733944954129,
"acc_stderr": 0.013517352714958792,
"acc_norm": 0.8880733944954129,
"acc_norm_stderr": 0.013517352714958792
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5462962962962963,
"acc_stderr": 0.033953227263757976,
"acc_norm": 0.5462962962962963,
"acc_norm_stderr": 0.033953227263757976
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9166666666666666,
"acc_stderr": 0.019398452135813905,
"acc_norm": 0.9166666666666666,
"acc_norm_stderr": 0.019398452135813905
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8776371308016878,
"acc_stderr": 0.021331741829746786,
"acc_norm": 0.8776371308016878,
"acc_norm_stderr": 0.021331741829746786
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7982062780269058,
"acc_stderr": 0.02693611191280227,
"acc_norm": 0.7982062780269058,
"acc_norm_stderr": 0.02693611191280227
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8473282442748091,
"acc_stderr": 0.031545216720054725,
"acc_norm": 0.8473282442748091,
"acc_norm_stderr": 0.031545216720054725
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8760330578512396,
"acc_stderr": 0.03008309871603521,
"acc_norm": 0.8760330578512396,
"acc_norm_stderr": 0.03008309871603521
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.03755265865037181,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.03755265865037181
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8098159509202454,
"acc_stderr": 0.030833491146281235,
"acc_norm": 0.8098159509202454,
"acc_norm_stderr": 0.030833491146281235
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5,
"acc_stderr": 0.04745789978762494,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04745789978762494
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.0376017800602662,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.0376017800602662
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9188034188034188,
"acc_stderr": 0.017893784904018533,
"acc_norm": 0.9188034188034188,
"acc_norm_stderr": 0.017893784904018533
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8735632183908046,
"acc_stderr": 0.01188448890589555,
"acc_norm": 0.8735632183908046,
"acc_norm_stderr": 0.01188448890589555
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7976878612716763,
"acc_stderr": 0.02162807738019612,
"acc_norm": 0.7976878612716763,
"acc_norm_stderr": 0.02162807738019612
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.529608938547486,
"acc_stderr": 0.01669315492738355,
"acc_norm": 0.529608938547486,
"acc_norm_stderr": 0.01669315492738355
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.02526169121972949,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.02526169121972949
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7909967845659164,
"acc_stderr": 0.02309314039837422,
"acc_norm": 0.7909967845659164,
"acc_norm_stderr": 0.02309314039837422
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.0216138093952248,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.0216138093952248
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5425531914893617,
"acc_stderr": 0.029719281272236834,
"acc_norm": 0.5425531914893617,
"acc_norm_stderr": 0.029719281272236834
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.546284224250326,
"acc_stderr": 0.012715404841277752,
"acc_norm": 0.546284224250326,
"acc_norm_stderr": 0.012715404841277752
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7169117647058824,
"acc_stderr": 0.02736586113151381,
"acc_norm": 0.7169117647058824,
"acc_norm_stderr": 0.02736586113151381
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.75,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.75,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04265792110940589,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04265792110940589
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8,
"acc_stderr": 0.02560737598657916,
"acc_norm": 0.8,
"acc_norm_stderr": 0.02560737598657916
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8756218905472637,
"acc_stderr": 0.023335401790166327,
"acc_norm": 0.8756218905472637,
"acc_norm_stderr": 0.023335401790166327
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.92,
"acc_stderr": 0.0272659924344291,
"acc_norm": 0.92,
"acc_norm_stderr": 0.0272659924344291
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8596491228070176,
"acc_stderr": 0.0266405825391332,
"acc_norm": 0.8596491228070176,
"acc_norm_stderr": 0.0266405825391332
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4259485924112607,
"mc1_stderr": 0.01731047190407654,
"mc2": 0.5904880959366052,
"mc2_stderr": 0.014903036806895207
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
ap07/input_dataset | 2023-10-09T05:56:32.000Z | [
"region:us"
] | ap07 | null | null | null | 0 | 0 | Entry not found |
feedexpdition/FinancialTickets | 2023-10-09T06:11:53.000Z | [
"license:mit",
"region:us"
] | feedexpdition | null | null | null | 0 | 0 | ---
license: mit
---
|
substratusai/the-stack-yaml-k8s | 2023-10-10T05:30:49.000Z | [
"task_categories:text-generation",
"language_creators:crowdsourced",
"language_creators:expert-generated",
"multilinguality:multilingual",
"size_categories:unknown",
"language:code",
"license:other",
"arxiv:2211.15533",
"arxiv:2107.03374",
"arxiv:2207.14157",
"region:us"
] | substratusai | null | null | null | 0 | 0 | ---
annotations_creators: []
language_creators:
- crowdsourced
- expert-generated
language:
- code
license:
- other
multilinguality:
- multilingual
pretty_name: The-Stack
size_categories:
- unknown
source_datasets: []
task_categories:
- text-generation
task_ids: []
extra_gated_prompt: |-
## Terms of Use for The Stack
The Stack dataset is a collection of source code in over 300 programming languages. We ask that you read and acknowledge the following points before using the dataset:
1. The Stack is a collection of source code from repositories with various licenses. Any use of all or part of the code gathered in The Stack must abide by the terms of the original licenses, including attribution clauses when relevant. We facilitate this by providing provenance information for each data point.
2. The Stack is regularly updated to enact validated data removal requests. By clicking on "Access repository", you agree to update your own version of The Stack to the most recent usable version specified by the maintainers in [the following thread](https://huggingface.co/datasets/bigcode/the-stack/discussions/7). If you have questions about dataset versions and allowed uses, please also ask them in the dataset’s [community discussions](https://huggingface.co/datasets/bigcode/the-stack/discussions/new). We will also notify users via email when the latest usable version changes.
  3. To host, share, or otherwise provide access to The Stack dataset, you must include [these Terms of Use](https://huggingface.co/datasets/bigcode/the-stack#terms-of-use-for-the-stack) and require users to agree to them.
By clicking on "Access repository" below, you accept that your contact information (email address and username) can be shared with the dataset maintainers as well.
extra_gated_fields:
Email: text
I have read the License and agree with its terms: checkbox
dataset_info:
features:
- name: hexsha
dtype: string
- name: size
dtype: int64
- name: ext
dtype: string
- name: lang
dtype: string
- name: max_stars_repo_path
dtype: string
- name: max_stars_repo_name
dtype: string
- name: max_stars_repo_head_hexsha
dtype: string
- name: max_stars_repo_licenses
sequence: string
- name: max_stars_count
dtype: int64
- name: max_stars_repo_stars_event_min_datetime
dtype: string
- name: max_stars_repo_stars_event_max_datetime
dtype: string
- name: max_issues_repo_path
dtype: string
- name: max_issues_repo_name
dtype: string
- name: max_issues_repo_head_hexsha
dtype: string
- name: max_issues_repo_licenses
sequence: string
- name: max_issues_count
dtype: int64
- name: max_issues_repo_issues_event_min_datetime
dtype: string
- name: max_issues_repo_issues_event_max_datetime
dtype: string
- name: max_forks_repo_path
dtype: string
- name: max_forks_repo_name
dtype: string
- name: max_forks_repo_head_hexsha
dtype: string
- name: max_forks_repo_licenses
sequence: string
- name: max_forks_count
dtype: int64
- name: max_forks_repo_forks_event_min_datetime
dtype: string
- name: max_forks_repo_forks_event_max_datetime
dtype: string
- name: content
dtype: string
- name: avg_line_length
dtype: float64
- name: max_line_length
dtype: int64
- name: alphanum_fraction
dtype: float64
splits:
- name: train
num_bytes: 2056665435.7311056
num_examples: 276520
download_size: 312473618
dataset_size: 2056665435.7311056
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for The Stack YAML K8s
This dataset is a subset of the `data/yaml` portion of The Stack. The YAML files were
parsed, and all valid Kubernetes (K8s) YAML files were kept; that is what this dataset consists of.
The dataset contains 276,520 valid K8s YAML files. It was created by running
the [the-stack-yaml-k8s.ipynb](https://github.com/substratusai/the-stack-yaml-k8s/blob/main/the-stack-k8s-yaml.ipynb)
notebook on K8s using [substratus.ai](https://substratus.ai).
Source code used to generate the dataset: https://github.com/substratusai/the-stack-yaml-k8s
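The exact filtering logic lives in the notebook and repository linked above. Purely as illustration, a minimal sketch of such a filter might look like the following, assuming "valid K8s YAML" means a file that parses as YAML and declares top-level `apiVersion` and `kind` (the helper name and the criterion are assumptions, not the notebook's actual code):
```python
import yaml  # PyYAML

def looks_like_k8s_manifest(text: str) -> bool:
    """Heuristic sketch: count a file as a K8s manifest if it parses as YAML
    and at least one document declares top-level apiVersion and kind."""
    try:
        docs = list(yaml.safe_load_all(text))
    except yaml.YAMLError:
        return False
    return any(
        isinstance(doc, dict) and "apiVersion" in doc and "kind" in doc
        for doc in docs
    )

print(looks_like_k8s_manifest("apiVersion: v1\nkind: Pod"))  # True
print(looks_like_k8s_manifest("name: not-a-manifest"))       # False
```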
Need some help? Questions? Join our Discord server: <a href="https://discord.gg/JeXhcmjZVm"><img alt="discord-invite" src="https://dcbadge.vercel.app/api/server/JeXhcmjZVm?style=flat"></a>
### How to use it
```python
from datasets import load_dataset
ds = load_dataset("substratusai/the-stack-yaml-k8s", split="train")
ds[0]["content"]
```
## Original The Stack Dataset Description
- **Homepage:** https://www.bigcode-project.org/
- **Repository:** https://github.com/bigcode-project
- **Paper:** https://arxiv.org/abs/2211.15533
- **Leaderboard:** N/A
- **Point of Contact:** contact@bigcode-project.org
## Dataset Structure
### Data Instances
Each data instance corresponds to one file. The content of the file is in the `content` feature, and other features (`max_stars_repo_name`, `max_stars_repo_licenses`, etc.) provide some metadata. Note that a given file can appear in several different repositories that satisfy our safe-license criterion. If that is the case, only the first of these repositories (in alphabetical order) is shown for simplicity.
### Data Fields
- `content` (string): the content of the file.
- `size` (integer): size of the uncompressed file.
- `lang` (string): the programming language.
- `ext` (string): file extension.
- `avg_line_length` (float): the average line-length of the file.
- `max_line_length` (integer): the maximum line-length of the file.
- `alphanum_fraction` (float): the fraction of characters in the file that are alphabetical or numerical characters.
- `hexsha` (string): unique git hash of the file.
- `max_{stars|forks|issues}_repo_path` (string): path to the file in the repo containing this file with the maximum number of `{stars|forks|issues}`.
- `max_{stars|forks|issues}_repo_name` (string): name of the repo containing this file with the maximum number of `{stars|forks|issues}`.
- `max_{stars|forks|issues}_repo_head_hexsha` (string): hexsha of the repository head.
- `max_{stars|forks|issues}_repo_licenses` (string): licenses in the repository.
- `max_{stars|forks|issues}_count` (integer): number of `{stars|forks|issues}` in the repository.
- `max_{stars|forks|issues}_repo_{stars|forks|issues}_min_datetime` (string): first timestamp of a `{stars|forks|issues}` event.
- `max_{stars|forks|issues}_repo_{stars|forks|issues}_max_datetime` (string): last timestamp of a `{stars|forks|issues}` event.
### Data Splits
The dataset has no splits and all data is loaded as the `train` split by default. If you want to set up a custom train-test split, beware that the dataset contains a lot of near-duplicates, which can cause leakage into the test split.
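As a minimal sketch of such a custom split with the standard `datasets` API (the 10% test fraction and the seed are arbitrary illustration values, and no deduplication is performed here):
```python
from datasets import load_dataset

ds = load_dataset("substratusai/the-stack-yaml-k8s", split="train")
# Caveat from above: near-duplicate files can land on both sides of a
# naive random split, so deduplicate first if leakage matters to you.
splits = ds.train_test_split(test_size=0.1, seed=42)
train_ds, test_ds = splits["train"], splits["test"]
print(len(train_ds), len(test_ds))
```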
## Dataset Creation
### Curation Rationale
One of the challenges faced by researchers working on code LLMs is the lack of openness and transparency around the development of these systems. Most prior works described the high-level data collection process but did not release the training data. It is therefore difficult for other researchers to fully reproduce these models and understand what kind of pre-training data leads to high-performing code LLMs. By releasing an open large-scale code dataset we hope to make training of code LLMs more reproducible.
### Source Data
#### Initial Data Collection and Normalization
220.92M active GitHub repository names were collected from the event archives published between January 1st, 2015 and March 31st, 2022 on [GHArchive](https://gharchive.org/). Only 137.36M of these repositories were public and accessible on GitHub – others were not accessible as they had been deleted by their owners. 51.76B files were downloaded from the public repositories on GitHub between November 2021 and June 2022. 5.28B files were unique. The uncompressed size of all stored files is 92.36TB.
The list of programming language extensions is taken from this [list](https://gist.github.com/ppisarczyk/43962d06686722d26d176fad46879d41) (also provided in Appendix C of the paper).
Near-deduplication was implemented in the pre-processing pipeline on top of exact deduplication. To find near-duplicates, MinHash with 256 permutations of all documents was computed in linear time. Locality Sensitive Hashing was used to find the clusters of duplicates. Jaccard similarities were then computed inside these clusters, with a similarity threshold of 0.85, to remove any false positives. Roughly 40% of permissively licensed files were (near-)duplicates. See section 3 of the paper for further details.
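As a rough sketch of the technique (not the BigCode pipeline itself), the third-party `datasketch` library provides MinHash and LSH primitives; the whitespace tokenization and toy documents below are illustrative assumptions:
```python
from datasketch import MinHash, MinHashLSH  # pip install datasketch

def minhash_of(text: str, num_perm: int = 256) -> MinHash:
    m = MinHash(num_perm=num_perm)
    for token in set(text.split()):  # crude tokenization, for illustration only
        m.update(token.encode("utf-8"))
    return m

docs = {
    "a.yaml": "apiVersion: v1 kind: Pod metadata: name: web",
    "b.yaml": "apiVersion: v1 kind: Pod metadata: name: web",   # near-duplicate of a.yaml
    "c.yaml": "apiVersion: v1 kind: Service metadata: name: db",
}
# Index all documents, then query for candidates above the 0.85 threshold.
lsh = MinHashLSH(threshold=0.85, num_perm=256)
for key, text in docs.items():
    lsh.insert(key, minhash_of(text))
print(lsh.query(minhash_of(docs["a.yaml"])))  # ['a.yaml', 'b.yaml'] (order may vary)
# The pipeline described above additionally verifies candidates with exact
# Jaccard similarities inside each cluster to drop false positives.
```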
The following are not stored:
- Files that cannot contribute to training code: binary files, empty files, and files that could not be decoded
- Files larger than 1MB
- The excluded file extensions are listed in Appendix B of the paper.
##### License detection
Permissive licenses have minimal restrictions on how the software can be copied, modified, and redistributed. The full list of licenses can be found [here](https://huggingface.co/datasets/bigcode/the-stack-dedup/blob/main/licenses.json).
GHArchive contained the license information for approximately 12% of the collected repositories. For the remaining repositories, [go-license-detector](https://github.com/src-d/go-license-detector) was run to detect the most likely SPDX license identifier. The detector did not detect a license for ~81% of the repositories, in which case the repository was excluded from the dataset.
A file was included in the safe license dataset if at least one of the repositories containing the file had a permissive license.
#### Who are the source language producers?
The source (code) language producers are users of GitHub that created unique repository names between January 1st, 2015, and March 31st, 2022.
### Personal and Sensitive Information
The released dataset may contain sensitive information such as emails, IP addresses, and API/ssh keys that have previously been published to public repositories on GitHub. Deduplication has helped to reduce the amount of sensitive data that may exist. In the event that the dataset contains personal information, researchers should only use public, non-personal information in support of conducting and publishing their [open-access](https://en.wikipedia.org/wiki/Open_access) research. Personal information should not be used for spamming purposes, including sending unsolicited emails or selling personal information. Complaints, removal requests, and "do not contact" requests can be sent to contact@bigcode-project.org.
The PII pipeline for this dataset is still a work in progress (see this [issue](https://github.com/bigcode-project/admin/issues/9) for updates). Researchers that wish to contribute to the anonymization pipeline of the project can apply to join [here](https://www.bigcode-project.org/docs/about/join/). Developers with source code in the dataset can request to have it removed [here](https://www.bigcode-project.org/docs/about/ip/) (proof of code contribution is required).
### Opting out of The Stack
We are giving developers the ability to have their code removed from the dataset upon request. The process for submitting and enacting removal requests will keep evolving throughout the project as we receive feedback and build up more data governance tools.
You can check if your code is in The Stack with the following ["Am I In The Stack?" Space](https://huggingface.co/spaces/bigcode/in-the-stack). If you'd like to have your data removed from the dataset follow the [instructions on GitHub](https://github.com/bigcode-project/opt-out-v2).
## Considerations for Using the Data
### Social Impact of Dataset
The Stack is an output of the BigCode Project. BigCode aims to be responsible by design and by default. The project is conducted in the spirit of Open Science, focused on the responsible development of LLMs for code.
With the release of The Stack, we aim to increase access, reproducibility, and transparency of code LLMs in the research community. Work to de-risk and improve on the implementation of ethical best practices of code LLMs is conducted in various BigCode working groups. The Legal, Ethics, and Governance working group has explored topics such as licensing (including copyleft and the intended use of permissively licensed code), attribution of generated code to original code, rights to restrict processing, the inclusion of Personally Identifiable Information (PII), and risks of malicious code, among other topics. This work is ongoing as of October 25th, 2022.
We expect code LLMs to enable people from diverse backgrounds to write higher quality code and develop low-code applications. Mission-critical software could become easier to maintain as professional developers are guided by code-generating systems on how to write more robust and efficient code. While the social impact is intended to be positive, the increased accessibility of code LLMs comes with certain risks such as over-reliance on the generated code and long-term effects on the software development job market.
A broader impact analysis relating to Code LLMs can be found in section 7 of this [paper](https://arxiv.org/abs/2107.03374). An in-depth risk assessment for Code LLMs can be found in section 4 of this [paper](https://arxiv.org/abs/2207.14157).
### Discussion of Biases
The code collected from GitHub does not contain demographic information or proxy information about the demographics. However, it is not without risks,
as the comments within the code may contain harmful or offensive language, which could be learned by the models.
Widely adopted programming languages like C and JavaScript are overrepresented compared to niche programming languages like Julia and Scala. Some programming languages such as SQL, Batchfile, and TypeScript are less likely to be permissively licensed (4% vs the average 10%). This may result in a biased representation of those languages. Permissively licensed files also tend to be longer.
Roughly 40 natural languages are present in docstrings and comments, with English being the most prevalent. In Python files, English makes up ~96% of the dataset.
For further information on data analysis of the Stack, see this [repo](https://github.com/bigcode-project/bigcode-analysis).
### Other Known Limitations
One of the current limitations of The Stack is that scraped HTML for websites may not be compliant with Web Content Accessibility Guidelines ([WCAG](https://www.w3.org/WAI/standards-guidelines/wcag/)). This could have an impact on HTML-generated code that may introduce web accessibility issues.
The training dataset could contain malicious code and/or the model could be used to generate malware or ransomware.
To the best of our knowledge, all files contained in the dataset are licensed with one of the permissive licenses (see list in [Licensing information](#licensing-information)). The accuracy of license attribution is limited by the accuracy of GHArchive and go-license-detector. Any mistakes should be reported to BigCode Project for review and follow-up as needed.
## Additional Information
### Dataset Curators
1. Harm de Vries, ServiceNow Research, harm.devries@servicenow.com
2. Leandro von Werra, Hugging Face, leandro@huggingface.co
### Licensing Information
The Stack is a collection of source code from repositories with various licenses. Any use of all or part of the code gathered in The Stack must abide by the terms of the original licenses, including attribution clauses when relevant. We facilitate this by providing provenance information for each data point.
The list of [SPDX license identifiers](https://spdx.org/licenses/) included in the dataset can be found [here](https://huggingface.co/datasets/bigcode/the-stack/blob/main/licenses.json).
### Citation Information
```
@article{Kocetkov2022TheStack,
title={The Stack: 3 TB of permissively licensed source code},
author={Kocetkov, Denis and Li, Raymond and Ben Allal, Loubna and Li, Jia and Mou, Chenghao and Muñoz Ferrandis, Carlos and Jernite, Yacine and Mitchell, Margaret and Hughes, Sean and Wolf, Thomas and Bahdanau, Dzmitry and von Werra, Leandro and de Vries, Harm},
journal={Preprint},
year={2022}
}
```
## Terms of Use for The Stack
The Stack dataset is a collection of source code in over 300 programming languages. We ask that you read and acknowledge the following points before using the dataset:
1. The Stack is a collection of source code from repositories with various licenses. Any use of all or part of the code gathered in The Stack must abide by the terms of the original licenses, including attribution clauses when relevant. We facilitate this by providing provenance information for each data point.
2. The Stack is regularly updated to enact validated data removal requests. By clicking on "Access repository", you agree to update your own version of The Stack to the most recent usable version specified by the maintainers in [the following thread](https://huggingface.co/datasets/bigcode/the-stack/discussions/7). If you have questions about dataset versions and allowed uses, please also ask them in the dataset’s [community discussions](https://huggingface.co/datasets/bigcode/the-stack/discussions/new). We will also notify users via email when the latest usable version changes.
3. To host, share, or otherwise provide access to The Stack dataset, you must include these Terms of Use and require users to agree to them.
|
chunpingvi/tinystories | 2023-10-09T06:09:19.000Z | [
"region:us"
] | chunpingvi | null | null | null | 0 | 0 | Entry not found |
darcy01/autotrain-data-hanzbydarcycao | 2023-10-09T06:13:45.000Z | [
"task_categories:translation",
"language:zh",
"language:en",
"region:us"
] | darcy01 | null | null | null | 0 | 0 | ---
language:
- zh
- en
task_categories:
- translation
---
# AutoTrain Dataset for project: hanzbydarcycao
## Dataset Description
This dataset has been automatically processed by AutoTrain for project hanzbydarcycao.
### Languages
The language code for the dataset is `zh2en`, i.e. Chinese-to-English translation (a language-pair label rather than a standard BCP-47 tag).
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"source": "sarashi",
"target": "sarashi"
},
{
"source": "Dojo",
"target": "Dojo"
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"source": "Value(dtype='string', id=None)",
"target": "Value(dtype='string', id=None)"
}
```
### Dataset Splits
This dataset is split into a train and a validation split. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 98 |
| valid | 25 |
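A minimal loading sketch (assuming the dataset can be loaded directly and that the split names match the table above; AutoTrain data repositories can vary in layout):
```python
from datasets import load_dataset

# Assumption: splits are exposed under the names shown in the table above.
ds = load_dataset("darcy01/autotrain-data-hanzbydarcycao")
print(ds)              # split names and sizes
print(ds["train"][0])  # {"source": ..., "target": ...}
```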
|
miojizzy/genshin_artifact_recognize_datasets | 2023-10-09T10:50:33.000Z | [
"region:us"
] | miojizzy | Monster Hunter Rise images and labels. | null | null | 0 | 0 | Entry not found |
ChristophSchuhmann/OpenClip-B32-KNN-Captioner | 2023-10-09T06:16:49.000Z | [
"license:apache-2.0",
"region:us"
] | ChristophSchuhmann | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
andre156/imagenette_segmented_test | 2023-10-09T06:49:54.000Z | [
"task_categories:image-segmentation",
"task_categories:image-classification",
"size_categories:n<1K",
"language:en",
"license:mit",
"region:us"
] | andre156 | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- image-segmentation
- image-classification
language:
- en
size_categories:
- n<1K
--- |
Falah/flowers_seed_prompts | 2023-10-09T06:38:59.000Z | [
"region:us"
] | Falah | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 89719
num_examples: 1000
download_size: 2001
dataset_size: 89719
---
# Dataset Card for "flowers_seed_prompts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mychen76/ShareGPT_V3_unfiltered_cleaned_small_9k | 2023-10-09T06:56:38.000Z | [
"region:us"
] | mychen76 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: string
- name: conversations
list:
- name: from
dtype: string
- name: markdown
struct:
- name: answer
dtype: string
- name: index
dtype: int64
- name: type
dtype: string
- name: text
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 57188795.51333581
num_examples: 8473
- name: test
num_bytes: 6358060.35330607
num_examples: 942
- name: valid
num_bytes: 641205.6619576185
num_examples: 95
download_size: 28307098
dataset_size: 64188061.5285995
---
# Dataset Card for "ShareGPT_V3_unfiltered_cleaned_small_9k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_krevas__LDCC-Instruct-Llama-2-ko-13B | 2023-10-09T06:56:38.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of krevas/LDCC-Instruct-Llama-2-ko-13B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [krevas/LDCC-Instruct-Llama-2-ko-13B](https://huggingface.co/krevas/LDCC-Instruct-Llama-2-ko-13B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_krevas__LDCC-Instruct-Llama-2-ko-13B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-09T06:55:19.126017](https://huggingface.co/datasets/open-llm-leaderboard/details_krevas__LDCC-Instruct-Llama-2-ko-13B/blob/main/results_2023-10-09T06-55-19.126017.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5140887884293746,\n\
\ \"acc_stderr\": 0.034831195333324204,\n \"acc_norm\": 0.5180581384469735,\n\
\ \"acc_norm_stderr\": 0.03481277047428223,\n \"mc1\": 0.26193390452876375,\n\
\ \"mc1_stderr\": 0.01539211880501503,\n \"mc2\": 0.37999611805412853,\n\
\ \"mc2_stderr\": 0.013428724763055466\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5392491467576792,\n \"acc_stderr\": 0.014566303676636588,\n\
\ \"acc_norm\": 0.5674061433447098,\n \"acc_norm_stderr\": 0.014478005694182526\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6096395140410277,\n\
\ \"acc_stderr\": 0.004868341056566223,\n \"acc_norm\": 0.8156741684923322,\n\
\ \"acc_norm_stderr\": 0.0038695723555438196\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4666666666666667,\n\
\ \"acc_stderr\": 0.043097329010363554,\n \"acc_norm\": 0.4666666666666667,\n\
\ \"acc_norm_stderr\": 0.043097329010363554\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5197368421052632,\n \"acc_stderr\": 0.04065771002562605,\n\
\ \"acc_norm\": 0.5197368421052632,\n \"acc_norm_stderr\": 0.04065771002562605\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n\
\ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5358490566037736,\n \"acc_stderr\": 0.030693675018458003,\n\
\ \"acc_norm\": 0.5358490566037736,\n \"acc_norm_stderr\": 0.030693675018458003\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5833333333333334,\n\
\ \"acc_stderr\": 0.04122728707651282,\n \"acc_norm\": 0.5833333333333334,\n\
\ \"acc_norm_stderr\": 0.04122728707651282\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.39,\n\
\ \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.44508670520231214,\n\
\ \"acc_stderr\": 0.03789401760283647,\n \"acc_norm\": 0.44508670520231214,\n\
\ \"acc_norm_stderr\": 0.03789401760283647\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.042207736591714506,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.042207736591714506\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \"acc_norm\": 0.67,\n\
\ \"acc_norm_stderr\": 0.04725815626252607\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.42127659574468085,\n \"acc_stderr\": 0.03227834510146268,\n\
\ \"acc_norm\": 0.42127659574468085,\n \"acc_norm_stderr\": 0.03227834510146268\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n\
\ \"acc_stderr\": 0.0433913832257986,\n \"acc_norm\": 0.30701754385964913,\n\
\ \"acc_norm_stderr\": 0.0433913832257986\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.42758620689655175,\n \"acc_stderr\": 0.041227371113703316,\n\
\ \"acc_norm\": 0.42758620689655175,\n \"acc_norm_stderr\": 0.041227371113703316\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3201058201058201,\n \"acc_stderr\": 0.024026846392873506,\n \"\
acc_norm\": 0.3201058201058201,\n \"acc_norm_stderr\": 0.024026846392873506\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30952380952380953,\n\
\ \"acc_stderr\": 0.04134913018303316,\n \"acc_norm\": 0.30952380952380953,\n\
\ \"acc_norm_stderr\": 0.04134913018303316\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5935483870967742,\n\
\ \"acc_stderr\": 0.027941727346256304,\n \"acc_norm\": 0.5935483870967742,\n\
\ \"acc_norm_stderr\": 0.027941727346256304\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3448275862068966,\n \"acc_stderr\": 0.03344283744280458,\n\
\ \"acc_norm\": 0.3448275862068966,\n \"acc_norm_stderr\": 0.03344283744280458\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\
: 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6303030303030303,\n \"acc_stderr\": 0.03769430314512567,\n\
\ \"acc_norm\": 0.6303030303030303,\n \"acc_norm_stderr\": 0.03769430314512567\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6262626262626263,\n \"acc_stderr\": 0.034468977386593325,\n \"\
acc_norm\": 0.6262626262626263,\n \"acc_norm_stderr\": 0.034468977386593325\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7305699481865285,\n \"acc_stderr\": 0.03201867122877794,\n\
\ \"acc_norm\": 0.7305699481865285,\n \"acc_norm_stderr\": 0.03201867122877794\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4461538461538462,\n \"acc_stderr\": 0.02520357177302833,\n \
\ \"acc_norm\": 0.4461538461538462,\n \"acc_norm_stderr\": 0.02520357177302833\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3148148148148148,\n \"acc_stderr\": 0.028317533496066485,\n \
\ \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.028317533496066485\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5252100840336135,\n \"acc_stderr\": 0.0324371805513741,\n \
\ \"acc_norm\": 0.5252100840336135,\n \"acc_norm_stderr\": 0.0324371805513741\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2582781456953642,\n \"acc_stderr\": 0.035737053147634576,\n \"\
acc_norm\": 0.2582781456953642,\n \"acc_norm_stderr\": 0.035737053147634576\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6678899082568808,\n \"acc_stderr\": 0.02019268298542333,\n \"\
acc_norm\": 0.6678899082568808,\n \"acc_norm_stderr\": 0.02019268298542333\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.35185185185185186,\n \"acc_stderr\": 0.032568505702936484,\n \"\
acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.032568505702936484\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6666666666666666,\n \"acc_stderr\": 0.03308611113236436,\n \"\
acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.03308611113236436\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6919831223628692,\n \"acc_stderr\": 0.0300523893356057,\n \
\ \"acc_norm\": 0.6919831223628692,\n \"acc_norm_stderr\": 0.0300523893356057\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.600896860986547,\n\
\ \"acc_stderr\": 0.03286745312567961,\n \"acc_norm\": 0.600896860986547,\n\
\ \"acc_norm_stderr\": 0.03286745312567961\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5801526717557252,\n \"acc_stderr\": 0.043285772152629715,\n\
\ \"acc_norm\": 0.5801526717557252,\n \"acc_norm_stderr\": 0.043285772152629715\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6859504132231405,\n \"acc_stderr\": 0.04236964753041018,\n \"\
acc_norm\": 0.6859504132231405,\n \"acc_norm_stderr\": 0.04236964753041018\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6018518518518519,\n\
\ \"acc_stderr\": 0.04732332615978813,\n \"acc_norm\": 0.6018518518518519,\n\
\ \"acc_norm_stderr\": 0.04732332615978813\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.656441717791411,\n \"acc_stderr\": 0.037311335196738925,\n\
\ \"acc_norm\": 0.656441717791411,\n \"acc_norm_stderr\": 0.037311335196738925\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4017857142857143,\n\
\ \"acc_stderr\": 0.04653333146973646,\n \"acc_norm\": 0.4017857142857143,\n\
\ \"acc_norm_stderr\": 0.04653333146973646\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6601941747572816,\n \"acc_stderr\": 0.046897659372781335,\n\
\ \"acc_norm\": 0.6601941747572816,\n \"acc_norm_stderr\": 0.046897659372781335\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7991452991452992,\n\
\ \"acc_stderr\": 0.02624677294689048,\n \"acc_norm\": 0.7991452991452992,\n\
\ \"acc_norm_stderr\": 0.02624677294689048\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7113665389527458,\n\
\ \"acc_stderr\": 0.016203792703197776,\n \"acc_norm\": 0.7113665389527458,\n\
\ \"acc_norm_stderr\": 0.016203792703197776\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5520231213872833,\n \"acc_stderr\": 0.02677299065336182,\n\
\ \"acc_norm\": 0.5520231213872833,\n \"acc_norm_stderr\": 0.02677299065336182\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24916201117318434,\n\
\ \"acc_stderr\": 0.014465893829859924,\n \"acc_norm\": 0.24916201117318434,\n\
\ \"acc_norm_stderr\": 0.014465893829859924\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5359477124183006,\n \"acc_stderr\": 0.02855582751652878,\n\
\ \"acc_norm\": 0.5359477124183006,\n \"acc_norm_stderr\": 0.02855582751652878\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6366559485530546,\n\
\ \"acc_stderr\": 0.027316847674192703,\n \"acc_norm\": 0.6366559485530546,\n\
\ \"acc_norm_stderr\": 0.027316847674192703\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6265432098765432,\n \"acc_stderr\": 0.026915003011380157,\n\
\ \"acc_norm\": 0.6265432098765432,\n \"acc_norm_stderr\": 0.026915003011380157\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3617021276595745,\n \"acc_stderr\": 0.028663820147199492,\n \
\ \"acc_norm\": 0.3617021276595745,\n \"acc_norm_stderr\": 0.028663820147199492\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4106910039113429,\n\
\ \"acc_stderr\": 0.012564871542534353,\n \"acc_norm\": 0.4106910039113429,\n\
\ \"acc_norm_stderr\": 0.012564871542534353\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4485294117647059,\n \"acc_stderr\": 0.0302114796091216,\n\
\ \"acc_norm\": 0.4485294117647059,\n \"acc_norm_stderr\": 0.0302114796091216\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.553921568627451,\n \"acc_stderr\": 0.020109864547181357,\n \
\ \"acc_norm\": 0.553921568627451,\n \"acc_norm_stderr\": 0.020109864547181357\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5909090909090909,\n\
\ \"acc_stderr\": 0.04709306978661896,\n \"acc_norm\": 0.5909090909090909,\n\
\ \"acc_norm_stderr\": 0.04709306978661896\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5346938775510204,\n \"acc_stderr\": 0.03193207024425314,\n\
\ \"acc_norm\": 0.5346938775510204,\n \"acc_norm_stderr\": 0.03193207024425314\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7064676616915423,\n\
\ \"acc_stderr\": 0.03220024104534204,\n \"acc_norm\": 0.7064676616915423,\n\
\ \"acc_norm_stderr\": 0.03220024104534204\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.463855421686747,\n\
\ \"acc_stderr\": 0.03882310850890593,\n \"acc_norm\": 0.463855421686747,\n\
\ \"acc_norm_stderr\": 0.03882310850890593\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.783625730994152,\n \"acc_stderr\": 0.03158149539338734,\n\
\ \"acc_norm\": 0.783625730994152,\n \"acc_norm_stderr\": 0.03158149539338734\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.26193390452876375,\n\
\ \"mc1_stderr\": 0.01539211880501503,\n \"mc2\": 0.37999611805412853,\n\
\ \"mc2_stderr\": 0.013428724763055466\n }\n}\n```"
repo_url: https://huggingface.co/krevas/LDCC-Instruct-Llama-2-ko-13B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|arc:challenge|25_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hellaswag|10_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T06-55-19.126017.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T06-55-19.126017.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T06-55-19.126017.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T06-55-19.126017.parquet'
- config_name: results
data_files:
- split: 2023_10_09T06_55_19.126017
path:
- results_2023-10-09T06-55-19.126017.parquet
- split: latest
path:
- results_2023-10-09T06-55-19.126017.parquet
---
# Dataset Card for Evaluation run of krevas/LDCC-Instruct-Llama-2-ko-13B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/krevas/LDCC-Instruct-Llama-2-ko-13B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [krevas/LDCC-Instruct-Llama-2-ko-13B](https://huggingface.co/krevas/LDCC-Instruct-Llama-2-ko-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
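# "harness_truthfulqa_mc_0" is one of the 61 per-task configurations;
# the "train" split always points at the latest run.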
data = load_dataset("open-llm-leaderboard/details_krevas__LDCC-Instruct-Llama-2-ko-13B",
"harness_truthfulqa_mc_0",
split="train")
```
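Similarly, the aggregated metrics can be read from the `results` configuration declared in the YAML header above. A minimal sketch (the exact column layout of the parquet file is an assumption, so adjust the final print to what you actually get back):
```python
from datasets import load_dataset

# The "results" config exposes a timestamped split per run plus "latest",
# which points at results_2023-10-09T06-55-19.126017.parquet (see the YAML above).
results = load_dataset(
    "open-llm-leaderboard/details_krevas__LDCC-Instruct-Llama-2-ko-13B",
    "results",
    split="latest",
)
print(results[0])  # expected: a row carrying the aggregated metrics of the run
```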
## Latest results
These are the [latest results from run 2023-10-09T06:55:19.126017](https://huggingface.co/datasets/open-llm-leaderboard/details_krevas__LDCC-Instruct-Llama-2-ko-13B/blob/main/results_2023-10-09T06-55-19.126017.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.5140887884293746,
"acc_stderr": 0.034831195333324204,
"acc_norm": 0.5180581384469735,
"acc_norm_stderr": 0.03481277047428223,
"mc1": 0.26193390452876375,
"mc1_stderr": 0.01539211880501503,
"mc2": 0.37999611805412853,
"mc2_stderr": 0.013428724763055466
},
"harness|arc:challenge|25": {
"acc": 0.5392491467576792,
"acc_stderr": 0.014566303676636588,
"acc_norm": 0.5674061433447098,
"acc_norm_stderr": 0.014478005694182526
},
"harness|hellaswag|10": {
"acc": 0.6096395140410277,
"acc_stderr": 0.004868341056566223,
"acc_norm": 0.8156741684923322,
"acc_norm_stderr": 0.0038695723555438196
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4666666666666667,
"acc_stderr": 0.043097329010363554,
"acc_norm": 0.4666666666666667,
"acc_norm_stderr": 0.043097329010363554
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5197368421052632,
"acc_stderr": 0.04065771002562605,
"acc_norm": 0.5197368421052632,
"acc_norm_stderr": 0.04065771002562605
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5358490566037736,
"acc_stderr": 0.030693675018458003,
"acc_norm": 0.5358490566037736,
"acc_norm_stderr": 0.030693675018458003
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.04122728707651282,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.04122728707651282
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.44508670520231214,
"acc_stderr": 0.03789401760283647,
"acc_norm": 0.44508670520231214,
"acc_norm_stderr": 0.03789401760283647
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.042207736591714506,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.042207736591714506
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.42127659574468085,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.42127659574468085,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.30701754385964913,
"acc_stderr": 0.0433913832257986,
"acc_norm": 0.30701754385964913,
"acc_norm_stderr": 0.0433913832257986
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.42758620689655175,
"acc_stderr": 0.041227371113703316,
"acc_norm": 0.42758620689655175,
"acc_norm_stderr": 0.041227371113703316
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3201058201058201,
"acc_stderr": 0.024026846392873506,
"acc_norm": 0.3201058201058201,
"acc_norm_stderr": 0.024026846392873506
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30952380952380953,
"acc_stderr": 0.04134913018303316,
"acc_norm": 0.30952380952380953,
"acc_norm_stderr": 0.04134913018303316
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5935483870967742,
"acc_stderr": 0.027941727346256304,
"acc_norm": 0.5935483870967742,
"acc_norm_stderr": 0.027941727346256304
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3448275862068966,
"acc_stderr": 0.03344283744280458,
"acc_norm": 0.3448275862068966,
"acc_norm_stderr": 0.03344283744280458
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6303030303030303,
"acc_stderr": 0.03769430314512567,
"acc_norm": 0.6303030303030303,
"acc_norm_stderr": 0.03769430314512567
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6262626262626263,
"acc_stderr": 0.034468977386593325,
"acc_norm": 0.6262626262626263,
"acc_norm_stderr": 0.034468977386593325
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7305699481865285,
"acc_stderr": 0.03201867122877794,
"acc_norm": 0.7305699481865285,
"acc_norm_stderr": 0.03201867122877794
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4461538461538462,
"acc_stderr": 0.02520357177302833,
"acc_norm": 0.4461538461538462,
"acc_norm_stderr": 0.02520357177302833
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.028317533496066485,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.028317533496066485
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5252100840336135,
"acc_stderr": 0.0324371805513741,
"acc_norm": 0.5252100840336135,
"acc_norm_stderr": 0.0324371805513741
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2582781456953642,
"acc_stderr": 0.035737053147634576,
"acc_norm": 0.2582781456953642,
"acc_norm_stderr": 0.035737053147634576
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6678899082568808,
"acc_stderr": 0.02019268298542333,
"acc_norm": 0.6678899082568808,
"acc_norm_stderr": 0.02019268298542333
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.032568505702936484,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.032568505702936484
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03308611113236436,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03308611113236436
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6919831223628692,
"acc_stderr": 0.0300523893356057,
"acc_norm": 0.6919831223628692,
"acc_norm_stderr": 0.0300523893356057
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.600896860986547,
"acc_stderr": 0.03286745312567961,
"acc_norm": 0.600896860986547,
"acc_norm_stderr": 0.03286745312567961
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5801526717557252,
"acc_stderr": 0.043285772152629715,
"acc_norm": 0.5801526717557252,
"acc_norm_stderr": 0.043285772152629715
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6859504132231405,
"acc_stderr": 0.04236964753041018,
"acc_norm": 0.6859504132231405,
"acc_norm_stderr": 0.04236964753041018
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6018518518518519,
"acc_stderr": 0.04732332615978813,
"acc_norm": 0.6018518518518519,
"acc_norm_stderr": 0.04732332615978813
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.656441717791411,
"acc_stderr": 0.037311335196738925,
"acc_norm": 0.656441717791411,
"acc_norm_stderr": 0.037311335196738925
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4017857142857143,
"acc_stderr": 0.04653333146973646,
"acc_norm": 0.4017857142857143,
"acc_norm_stderr": 0.04653333146973646
},
"harness|hendrycksTest-management|5": {
"acc": 0.6601941747572816,
"acc_stderr": 0.046897659372781335,
"acc_norm": 0.6601941747572816,
"acc_norm_stderr": 0.046897659372781335
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7991452991452992,
"acc_stderr": 0.02624677294689048,
"acc_norm": 0.7991452991452992,
"acc_norm_stderr": 0.02624677294689048
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7113665389527458,
"acc_stderr": 0.016203792703197776,
"acc_norm": 0.7113665389527458,
"acc_norm_stderr": 0.016203792703197776
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5520231213872833,
"acc_stderr": 0.02677299065336182,
"acc_norm": 0.5520231213872833,
"acc_norm_stderr": 0.02677299065336182
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24916201117318434,
"acc_stderr": 0.014465893829859924,
"acc_norm": 0.24916201117318434,
"acc_norm_stderr": 0.014465893829859924
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5359477124183006,
"acc_stderr": 0.02855582751652878,
"acc_norm": 0.5359477124183006,
"acc_norm_stderr": 0.02855582751652878
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6366559485530546,
"acc_stderr": 0.027316847674192703,
"acc_norm": 0.6366559485530546,
"acc_norm_stderr": 0.027316847674192703
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6265432098765432,
"acc_stderr": 0.026915003011380157,
"acc_norm": 0.6265432098765432,
"acc_norm_stderr": 0.026915003011380157
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3617021276595745,
"acc_stderr": 0.028663820147199492,
"acc_norm": 0.3617021276595745,
"acc_norm_stderr": 0.028663820147199492
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4106910039113429,
"acc_stderr": 0.012564871542534353,
"acc_norm": 0.4106910039113429,
"acc_norm_stderr": 0.012564871542534353
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4485294117647059,
"acc_stderr": 0.0302114796091216,
"acc_norm": 0.4485294117647059,
"acc_norm_stderr": 0.0302114796091216
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.553921568627451,
"acc_stderr": 0.020109864547181357,
"acc_norm": 0.553921568627451,
"acc_norm_stderr": 0.020109864547181357
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5909090909090909,
"acc_stderr": 0.04709306978661896,
"acc_norm": 0.5909090909090909,
"acc_norm_stderr": 0.04709306978661896
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5346938775510204,
"acc_stderr": 0.03193207024425314,
"acc_norm": 0.5346938775510204,
"acc_norm_stderr": 0.03193207024425314
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7064676616915423,
"acc_stderr": 0.03220024104534204,
"acc_norm": 0.7064676616915423,
"acc_norm_stderr": 0.03220024104534204
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-virology|5": {
"acc": 0.463855421686747,
"acc_stderr": 0.03882310850890593,
"acc_norm": 0.463855421686747,
"acc_norm_stderr": 0.03882310850890593
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.783625730994152,
"acc_stderr": 0.03158149539338734,
"acc_norm": 0.783625730994152,
"acc_norm_stderr": 0.03158149539338734
},
"harness|truthfulqa:mc|0": {
"mc1": 0.26193390452876375,
"mc1_stderr": 0.01539211880501503,
"mc2": 0.37999611805412853,
"mc2_stderr": 0.013428724763055466
}
}
```
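The same numbers live in the JSON file linked above; a minimal sketch of fetching it directly with `huggingface_hub` (the top-level `"all"` key is taken from the excerpt above, and the rest of the file layout is an assumption):
```python
import json

from huggingface_hub import hf_hub_download

# Download the raw results file from the dataset repo and read the summary block.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_krevas__LDCC-Instruct-Llama-2-ko-13B",
    filename="results_2023-10-09T06-55-19.126017.json",
    repo_type="dataset",
)
with open(path) as f:
    results = json.load(f)
print(results["all"])  # assumed key, mirroring the excerpt above
```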
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
ChanHE/score_112_text | 2023-10-09T07:11:11.000Z | [
"region:us"
] | ChanHE | null | null | null | 0 | 0 | Entry not found |
johannes-garstenauer/embeddings_from_distilbert_masking_heaps_and_eval_part0 | 2023-10-09T07:16:29.000Z | [
"region:us"
] | johannes-garstenauer | null | null | null | 1 | 0 | ---
dataset_info:
features:
- name: struct
dtype: string
- name: label
dtype: int64
- name: pred
dtype: int64
- name: cls_layer_6
sequence: float32
- name: cls_layer_5
sequence: float32
- name: cls_layer_4
sequence: float32
splits:
- name: train
num_bytes: 1282993344
num_examples: 134592
download_size: 1493342036
dataset_size: 1282993344
---
# Dataset Card for "embeddings_from_distilbert_masking_heaps_and_eval_part0"
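The YAML header above declares a single `train` split whose rows pair a struct string and its gold/predicted labels with per-layer CLS embeddings. A minimal loading sketch (feature names come from that YAML; everything else is illustrative):
```python
from datasets import load_dataset

ds = load_dataset(
    "johannes-garstenauer/embeddings_from_distilbert_masking_heaps_and_eval_part0",
    split="train",
)
row = ds[0]
print(row["struct"][:80])           # the raw struct string
print(row["label"], row["pred"])    # gold label vs. model prediction
print(len(row["cls_layer_6"]))      # dimensionality of the layer-6 CLS embedding
```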
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AdiOO7/VectorDatabase | 2023-10-09T07:22:02.000Z | [
"task_categories:text-classification",
"size_categories:n<1K",
"language:en",
"license:apache-2.0",
"region:us"
] | AdiOO7 | null | null | null | 0 | 0 | ---
license: apache-2.0
task_categories:
- text-classification
language:
- en
size_categories:
- n<1K
--- |
Falah/night_time_prompts | 2023-10-09T07:18:42.000Z | [
"region:us"
] | Falah | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 706
num_examples: 5
download_size: 1559
dataset_size: 706
---
# Dataset Card for "night_time_prompts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
baebee/chatgpt-custom_inst | 2023-10-09T19:16:48.000Z | [
"task_categories:summarization",
"task_categories:question-answering",
"task_categories:conversational",
"size_categories:n<1K",
"language:en",
"language:tl",
"license:mit",
"region:us"
] | baebee | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- summarization
- question-answering
- conversational
language:
- en
- tl
size_categories:
- n<1K
---
## Languages: English, Tagalog
## Collection Process:
- Dialogs generated by instructing ChatGPT to respond concisely
- Responses edited by Nuph researchers for naturalness
- Bilingual exchanges added for diversity
## Intended Use:
- Train conversational agents
- Research in straightforward dialog
## Limitations:
- Small scale (300 rows)
- Biased toward English
- Limited to text conversations
## Ethics and Privacy:
- No personal or offensive content
- ChatGPT instructed to avoid unethical responses
- Data anonymized - no personally identifiable information |
giovanni92/MailFunc | 2023-10-09T07:35:56.000Z | [
"license:mit",
"region:us"
] | giovanni92 | null | null | null | 0 | 0 | ---
license: mit
---
|
OdiaGenAI/roleplay_english | 2023-10-10T06:51:20.000Z | [
"task_categories:question-answering",
"task_categories:conversational",
"size_categories:1K<n<10K",
"language:en",
"region:us"
] | OdiaGenAI | null | null | null | 1 | 0 | ---
task_categories:
- question-answering
- conversational
language:
- en
size_categories:
- 1K<n<10K
--- |
Falah/kids_coloring_book_prompts | 2023-10-09T08:09:10.000Z | [
"region:us"
] | Falah | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 287035
num_examples: 3000
download_size: 4140
dataset_size: 287035
---
# Dataset Card for "kids_coloring_book_prompts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
debthedev/Benglish_LLM_dataset | 2023-10-09T08:19:12.000Z | [
"region:us"
] | debthedev | null | null | null | 0 | 0 | Entry not found |
chiayewken/m3exam | 2023-10-09T08:38:11.000Z | [
"arxiv:2306.05179",
"region:us"
] | chiayewken | null | null | null | 0 | 0 | ---
dataset_info:
- config_name: afrikaans
features:
- name: question_text
dtype: string
- name: background
dtype: string
- name: answer_text
dtype: string
- name: options
sequence: string
- name: language
dtype: string
- name: level
dtype: string
- name: subject
dtype: string
- name: subject_category
dtype: string
splits:
- name: dev
num_bytes: 8860
num_examples: 25
- name: test
num_bytes: 194333
num_examples: 258
download_size: 71295
dataset_size: 203193
- config_name: chinese
features:
- name: question_text
dtype: string
- name: background
dtype: string
- name: answer_text
dtype: string
- name: options
sequence: string
- name: language
dtype: string
- name: level
dtype: string
- name: subject
dtype: string
- name: subject_category
dtype: string
splits:
- name: dev
num_bytes: 25055
num_examples: 29
- name: test
num_bytes: 485093
num_examples: 682
download_size: 289255
dataset_size: 510148
- config_name: english
features:
- name: question_text
dtype: string
- name: background
dtype: string
- name: answer_text
dtype: string
- name: options
sequence: string
- name: language
dtype: string
- name: level
dtype: string
- name: subject
dtype: string
- name: subject_category
dtype: string
splits:
- name: dev
num_bytes: 12792
num_examples: 32
- name: test
num_bytes: 2573796
num_examples: 1911
download_size: 697219
dataset_size: 2586588
- config_name: italian
features:
- name: question_text
dtype: string
- name: background
dtype: string
- name: answer_text
dtype: string
- name: options
sequence: string
- name: language
dtype: string
- name: level
dtype: string
- name: subject
dtype: string
- name: subject_category
dtype: string
splits:
- name: dev
num_bytes: 5834
num_examples: 18
- name: test
num_bytes: 2397963
num_examples: 811
download_size: 326671
dataset_size: 2403797
- config_name: javanese
features:
- name: question_text
dtype: string
- name: background
dtype: string
- name: answer_text
dtype: string
- name: options
sequence: string
- name: language
dtype: string
- name: level
dtype: string
- name: subject
dtype: string
- name: subject_category
dtype: string
splits:
- name: dev
num_bytes: 1425
num_examples: 6
- name: test
num_bytes: 187280
num_examples: 371
download_size: 84085
dataset_size: 188705
- config_name: portuguese
features:
- name: question_text
dtype: string
- name: background
dtype: string
- name: answer_text
dtype: string
- name: options
sequence: string
- name: language
dtype: string
- name: level
dtype: string
- name: subject
dtype: string
- name: subject_category
dtype: string
splits:
- name: dev
num_bytes: 20979
num_examples: 24
- name: test
num_bytes: 941655
num_examples: 889
download_size: 614816
dataset_size: 962634
- config_name: swahili
features:
- name: question_text
dtype: string
- name: background
dtype: string
- name: answer_text
dtype: string
- name: options
sequence: string
- name: language
dtype: string
- name: level
dtype: string
- name: subject
dtype: string
- name: subject_category
dtype: string
splits:
- name: dev
num_bytes: 2053
num_examples: 6
- name: test
num_bytes: 607215
num_examples: 428
download_size: 94031
dataset_size: 609268
- config_name: thai
features:
- name: question_text
dtype: string
- name: background
dtype: string
- name: answer_text
dtype: string
- name: options
sequence: string
- name: language
dtype: string
- name: level
dtype: string
- name: subject
dtype: string
- name: subject_category
dtype: string
splits:
- name: dev
num_bytes: 16185
num_examples: 26
- name: test
num_bytes: 2249737
num_examples: 2168
download_size: 901256
dataset_size: 2265922
- config_name: vietnamese
features:
- name: question_text
dtype: string
- name: background
dtype: string
- name: answer_text
dtype: string
- name: options
sequence: string
- name: language
dtype: string
- name: level
dtype: string
- name: subject
dtype: string
- name: subject_category
dtype: string
splits:
- name: dev
num_bytes: 7974
num_examples: 28
- name: test
num_bytes: 767759
num_examples: 1789
download_size: 375774
dataset_size: 775733
configs:
- config_name: afrikaans
data_files:
- split: dev
path: afrikaans/dev-*
- split: test
path: afrikaans/test-*
- config_name: chinese
data_files:
- split: dev
path: chinese/dev-*
- split: test
path: chinese/test-*
- config_name: english
data_files:
- split: dev
path: english/dev-*
- split: test
path: english/test-*
- config_name: italian
data_files:
- split: dev
path: italian/dev-*
- split: test
path: italian/test-*
- config_name: javanese
data_files:
- split: dev
path: javanese/dev-*
- split: test
path: javanese/test-*
- config_name: portuguese
data_files:
- split: dev
path: portuguese/dev-*
- split: test
path: portuguese/test-*
- config_name: swahili
data_files:
- split: dev
path: swahili/dev-*
- split: test
path: swahili/test-*
- config_name: thai
data_files:
- split: dev
path: thai/dev-*
- split: test
path: thai/test-*
- config_name: vietnamese
data_files:
- split: dev
path: vietnamese/dev-*
- split: test
path: vietnamese/test-*
---
# M3Exam: A Multilingual 🌏, Multimodal 🖼, Multilevel 📈 Benchmark for LLMs
This is the repository for [M3Exam: A Multilingual, Multimodal, Multilevel Benchmark for Examining Large Language Models](https://arxiv.org/abs/2306.05179/).
TL;DR: We introduce M3Exam, a novel benchmark sourced from real and official human exam questions for evaluating LLMs in a multilingual, multimodal, and multilevel context.

|
open-llm-leaderboard/details_openbmb__UltraLM-13b-v2.0 | 2023-10-09T08:35:34.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of openbmb/UltraLM-13b-v2.0
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [openbmb/UltraLM-13b-v2.0](https://huggingface.co/openbmb/UltraLM-13b-v2.0) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_openbmb__UltraLM-13b-v2.0\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-09T08:34:12.309014](https://huggingface.co/datasets/open-llm-leaderboard/details_openbmb__UltraLM-13b-v2.0/blob/main/results_2023-10-09T08-34-12.309014.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks; you can find each one in the results and in the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5630476446912529,\n\
\ \"acc_stderr\": 0.034298942974256355,\n \"acc_norm\": 0.5670632198379092,\n\
\ \"acc_norm_stderr\": 0.03427817426627408,\n \"mc1\": 0.3598531211750306,\n\
\ \"mc1_stderr\": 0.016801860466677154,\n \"mc2\": 0.4948303207858797,\n\
\ \"mc2_stderr\": 0.015361047603025122\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5870307167235495,\n \"acc_stderr\": 0.014388344935398326,\n\
\ \"acc_norm\": 0.6262798634812287,\n \"acc_norm_stderr\": 0.01413770860175909\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6172077275443139,\n\
\ \"acc_stderr\": 0.00485074868785994,\n \"acc_norm\": 0.8148775144393547,\n\
\ \"acc_norm_stderr\": 0.0038760312505449856\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4740740740740741,\n\
\ \"acc_stderr\": 0.04313531696750574,\n \"acc_norm\": 0.4740740740740741,\n\
\ \"acc_norm_stderr\": 0.04313531696750574\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5657894736842105,\n \"acc_stderr\": 0.040335656678483205,\n\
\ \"acc_norm\": 0.5657894736842105,\n \"acc_norm_stderr\": 0.040335656678483205\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.54,\n\
\ \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6113207547169811,\n \"acc_stderr\": 0.030000485448675986,\n\
\ \"acc_norm\": 0.6113207547169811,\n \"acc_norm_stderr\": 0.030000485448675986\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5763888888888888,\n\
\ \"acc_stderr\": 0.04132125019723369,\n \"acc_norm\": 0.5763888888888888,\n\
\ \"acc_norm_stderr\": 0.04132125019723369\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\"\
: 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5491329479768786,\n\
\ \"acc_stderr\": 0.03794012674697031,\n \"acc_norm\": 0.5491329479768786,\n\
\ \"acc_norm_stderr\": 0.03794012674697031\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.04655010411319616,\n\
\ \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.04655010411319616\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.73,\n \"acc_stderr\": 0.04461960433384739,\n \"acc_norm\": 0.73,\n\
\ \"acc_norm_stderr\": 0.04461960433384739\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.42127659574468085,\n \"acc_stderr\": 0.03227834510146268,\n\
\ \"acc_norm\": 0.42127659574468085,\n \"acc_norm_stderr\": 0.03227834510146268\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n\
\ \"acc_stderr\": 0.04096985139843671,\n \"acc_norm\": 0.2543859649122807,\n\
\ \"acc_norm_stderr\": 0.04096985139843671\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.04164188720169377,\n\
\ \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.04164188720169377\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3306878306878307,\n \"acc_stderr\": 0.02422996529842507,\n \"\
acc_norm\": 0.3306878306878307,\n \"acc_norm_stderr\": 0.02422996529842507\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.36507936507936506,\n\
\ \"acc_stderr\": 0.04306241259127153,\n \"acc_norm\": 0.36507936507936506,\n\
\ \"acc_norm_stderr\": 0.04306241259127153\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6741935483870968,\n\
\ \"acc_stderr\": 0.0266620105785671,\n \"acc_norm\": 0.6741935483870968,\n\
\ \"acc_norm_stderr\": 0.0266620105785671\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4187192118226601,\n \"acc_stderr\": 0.03471192860518468,\n\
\ \"acc_norm\": 0.4187192118226601,\n \"acc_norm_stderr\": 0.03471192860518468\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\"\
: 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.03713158067481913,\n\
\ \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.03713158067481913\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.702020202020202,\n \"acc_stderr\": 0.03258630383836556,\n \"acc_norm\"\
: 0.702020202020202,\n \"acc_norm_stderr\": 0.03258630383836556\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.7772020725388601,\n \"acc_stderr\": 0.030031147977641538,\n\
\ \"acc_norm\": 0.7772020725388601,\n \"acc_norm_stderr\": 0.030031147977641538\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5051282051282051,\n \"acc_stderr\": 0.02534967290683866,\n \
\ \"acc_norm\": 0.5051282051282051,\n \"acc_norm_stderr\": 0.02534967290683866\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3037037037037037,\n \"acc_stderr\": 0.028037929969114986,\n \
\ \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.028037929969114986\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.542016806722689,\n \"acc_stderr\": 0.03236361111951941,\n \
\ \"acc_norm\": 0.542016806722689,\n \"acc_norm_stderr\": 0.03236361111951941\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2913907284768212,\n \"acc_stderr\": 0.03710185726119995,\n \"\
acc_norm\": 0.2913907284768212,\n \"acc_norm_stderr\": 0.03710185726119995\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7522935779816514,\n \"acc_stderr\": 0.01850814360254783,\n \"\
acc_norm\": 0.7522935779816514,\n \"acc_norm_stderr\": 0.01850814360254783\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.46296296296296297,\n \"acc_stderr\": 0.03400603625538271,\n \"\
acc_norm\": 0.46296296296296297,\n \"acc_norm_stderr\": 0.03400603625538271\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7696078431372549,\n \"acc_stderr\": 0.029554292605695063,\n \"\
acc_norm\": 0.7696078431372549,\n \"acc_norm_stderr\": 0.029554292605695063\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7341772151898734,\n \"acc_stderr\": 0.02875679962965834,\n \
\ \"acc_norm\": 0.7341772151898734,\n \"acc_norm_stderr\": 0.02875679962965834\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6457399103139013,\n\
\ \"acc_stderr\": 0.03210062154134986,\n \"acc_norm\": 0.6457399103139013,\n\
\ \"acc_norm_stderr\": 0.03210062154134986\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6793893129770993,\n \"acc_stderr\": 0.04093329229834278,\n\
\ \"acc_norm\": 0.6793893129770993,\n \"acc_norm_stderr\": 0.04093329229834278\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.04236511258094632,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.04236511258094632\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6932515337423313,\n \"acc_stderr\": 0.03623089915724147,\n\
\ \"acc_norm\": 0.6932515337423313,\n \"acc_norm_stderr\": 0.03623089915724147\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3482142857142857,\n\
\ \"acc_stderr\": 0.04521829902833585,\n \"acc_norm\": 0.3482142857142857,\n\
\ \"acc_norm_stderr\": 0.04521829902833585\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6990291262135923,\n \"acc_stderr\": 0.04541609446503948,\n\
\ \"acc_norm\": 0.6990291262135923,\n \"acc_norm_stderr\": 0.04541609446503948\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.811965811965812,\n\
\ \"acc_stderr\": 0.025598193686652244,\n \"acc_norm\": 0.811965811965812,\n\
\ \"acc_norm_stderr\": 0.025598193686652244\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \
\ \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.049999999999999996\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7484035759897829,\n\
\ \"acc_stderr\": 0.015517322365529641,\n \"acc_norm\": 0.7484035759897829,\n\
\ \"acc_norm_stderr\": 0.015517322365529641\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.638728323699422,\n \"acc_stderr\": 0.0258622018522779,\n\
\ \"acc_norm\": 0.638728323699422,\n \"acc_norm_stderr\": 0.0258622018522779\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4011173184357542,\n\
\ \"acc_stderr\": 0.01639222189940707,\n \"acc_norm\": 0.4011173184357542,\n\
\ \"acc_norm_stderr\": 0.01639222189940707\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6568627450980392,\n \"acc_stderr\": 0.02718449890994162,\n\
\ \"acc_norm\": 0.6568627450980392,\n \"acc_norm_stderr\": 0.02718449890994162\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6495176848874598,\n\
\ \"acc_stderr\": 0.02709865262130175,\n \"acc_norm\": 0.6495176848874598,\n\
\ \"acc_norm_stderr\": 0.02709865262130175\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6481481481481481,\n \"acc_stderr\": 0.026571483480719964,\n\
\ \"acc_norm\": 0.6481481481481481,\n \"acc_norm_stderr\": 0.026571483480719964\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.40425531914893614,\n \"acc_stderr\": 0.029275532159704725,\n \
\ \"acc_norm\": 0.40425531914893614,\n \"acc_norm_stderr\": 0.029275532159704725\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.42242503259452413,\n\
\ \"acc_stderr\": 0.012615600475734921,\n \"acc_norm\": 0.42242503259452413,\n\
\ \"acc_norm_stderr\": 0.012615600475734921\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5735294117647058,\n \"acc_stderr\": 0.030042615832714857,\n\
\ \"acc_norm\": 0.5735294117647058,\n \"acc_norm_stderr\": 0.030042615832714857\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5866013071895425,\n \"acc_stderr\": 0.01992211568278669,\n \
\ \"acc_norm\": 0.5866013071895425,\n \"acc_norm_stderr\": 0.01992211568278669\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6448979591836734,\n \"acc_stderr\": 0.030635655150387638,\n\
\ \"acc_norm\": 0.6448979591836734,\n \"acc_norm_stderr\": 0.030635655150387638\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7562189054726368,\n\
\ \"acc_stderr\": 0.030360490154014645,\n \"acc_norm\": 0.7562189054726368,\n\
\ \"acc_norm_stderr\": 0.030360490154014645\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4759036144578313,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.4759036144578313,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7368421052631579,\n \"acc_stderr\": 0.03377310252209205,\n\
\ \"acc_norm\": 0.7368421052631579,\n \"acc_norm_stderr\": 0.03377310252209205\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3598531211750306,\n\
\ \"mc1_stderr\": 0.016801860466677154,\n \"mc2\": 0.4948303207858797,\n\
\ \"mc2_stderr\": 0.015361047603025122\n }\n}\n```"
repo_url: https://huggingface.co/openbmb/UltraLM-13b-v2.0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|arc:challenge|25_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hellaswag|10_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T08-34-12.309014.parquet'
- config_name: results
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- results_2023-10-09T08-34-12.309014.parquet
- split: latest
path:
- results_2023-10-09T08-34-12.309014.parquet
---
# Dataset Card for Evaluation run of openbmb/UltraLM-13b-v2.0
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/openbmb/UltraLM-13b-v2.0
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [openbmb/UltraLM-13b-v2.0](https://huggingface.co/openbmb/UltraLM-13b-v2.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_openbmb__UltraLM-13b-v2.0",
"harness_truthfulqa_mc_0",
split="train")
```
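Any other configuration can be loaded the same way. As a minimal sketch (assuming only the "results" configuration and the "latest" split declared in the YAML metadata above), the aggregated results of the most recent run can be retrieved like this:
```python
from datasets import load_dataset

# The "results" config and the "latest" split are both declared in the
# metadata above; "latest" always resolves to the most recent run.
results = load_dataset(
    "open-llm-leaderboard/details_openbmb__UltraLM-13b-v2.0",
    "results",
    split="latest",
)
```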
## Latest results
These are the [latest results from run 2023-10-09T08:34:12.309014](https://huggingface.co/datasets/open-llm-leaderboard/details_openbmb__UltraLM-13b-v2.0/blob/main/results_2023-10-09T08-34-12.309014.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5630476446912529,
"acc_stderr": 0.034298942974256355,
"acc_norm": 0.5670632198379092,
"acc_norm_stderr": 0.03427817426627408,
"mc1": 0.3598531211750306,
"mc1_stderr": 0.016801860466677154,
"mc2": 0.4948303207858797,
"mc2_stderr": 0.015361047603025122
},
"harness|arc:challenge|25": {
"acc": 0.5870307167235495,
"acc_stderr": 0.014388344935398326,
"acc_norm": 0.6262798634812287,
"acc_norm_stderr": 0.01413770860175909
},
"harness|hellaswag|10": {
"acc": 0.6172077275443139,
"acc_stderr": 0.00485074868785994,
"acc_norm": 0.8148775144393547,
"acc_norm_stderr": 0.0038760312505449856
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4740740740740741,
"acc_stderr": 0.04313531696750574,
"acc_norm": 0.4740740740740741,
"acc_norm_stderr": 0.04313531696750574
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5657894736842105,
"acc_stderr": 0.040335656678483205,
"acc_norm": 0.5657894736842105,
"acc_norm_stderr": 0.040335656678483205
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6113207547169811,
"acc_stderr": 0.030000485448675986,
"acc_norm": 0.6113207547169811,
"acc_norm_stderr": 0.030000485448675986
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5763888888888888,
"acc_stderr": 0.04132125019723369,
"acc_norm": 0.5763888888888888,
"acc_norm_stderr": 0.04132125019723369
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5491329479768786,
"acc_stderr": 0.03794012674697031,
"acc_norm": 0.5491329479768786,
"acc_norm_stderr": 0.03794012674697031
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3235294117647059,
"acc_stderr": 0.04655010411319616,
"acc_norm": 0.3235294117647059,
"acc_norm_stderr": 0.04655010411319616
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.04461960433384739,
"acc_norm": 0.73,
"acc_norm_stderr": 0.04461960433384739
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.42127659574468085,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.42127659574468085,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.04096985139843671,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.04096985139843671
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.04164188720169377,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.04164188720169377
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3306878306878307,
"acc_stderr": 0.02422996529842507,
"acc_norm": 0.3306878306878307,
"acc_norm_stderr": 0.02422996529842507
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.36507936507936506,
"acc_stderr": 0.04306241259127153,
"acc_norm": 0.36507936507936506,
"acc_norm_stderr": 0.04306241259127153
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6741935483870968,
"acc_stderr": 0.0266620105785671,
"acc_norm": 0.6741935483870968,
"acc_norm_stderr": 0.0266620105785671
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4187192118226601,
"acc_stderr": 0.03471192860518468,
"acc_norm": 0.4187192118226601,
"acc_norm_stderr": 0.03471192860518468
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.03713158067481913,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.03713158067481913
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.702020202020202,
"acc_stderr": 0.03258630383836556,
"acc_norm": 0.702020202020202,
"acc_norm_stderr": 0.03258630383836556
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7772020725388601,
"acc_stderr": 0.030031147977641538,
"acc_norm": 0.7772020725388601,
"acc_norm_stderr": 0.030031147977641538
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5051282051282051,
"acc_stderr": 0.02534967290683866,
"acc_norm": 0.5051282051282051,
"acc_norm_stderr": 0.02534967290683866
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.028037929969114986,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.028037929969114986
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.542016806722689,
"acc_stderr": 0.03236361111951941,
"acc_norm": 0.542016806722689,
"acc_norm_stderr": 0.03236361111951941
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2913907284768212,
"acc_stderr": 0.03710185726119995,
"acc_norm": 0.2913907284768212,
"acc_norm_stderr": 0.03710185726119995
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7522935779816514,
"acc_stderr": 0.01850814360254783,
"acc_norm": 0.7522935779816514,
"acc_norm_stderr": 0.01850814360254783
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.46296296296296297,
"acc_stderr": 0.03400603625538271,
"acc_norm": 0.46296296296296297,
"acc_norm_stderr": 0.03400603625538271
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7696078431372549,
"acc_stderr": 0.029554292605695063,
"acc_norm": 0.7696078431372549,
"acc_norm_stderr": 0.029554292605695063
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7341772151898734,
"acc_stderr": 0.02875679962965834,
"acc_norm": 0.7341772151898734,
"acc_norm_stderr": 0.02875679962965834
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6457399103139013,
"acc_stderr": 0.03210062154134986,
"acc_norm": 0.6457399103139013,
"acc_norm_stderr": 0.03210062154134986
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6793893129770993,
"acc_stderr": 0.04093329229834278,
"acc_norm": 0.6793893129770993,
"acc_norm_stderr": 0.04093329229834278
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228732,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228732
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.04236511258094632,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.04236511258094632
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6932515337423313,
"acc_stderr": 0.03623089915724147,
"acc_norm": 0.6932515337423313,
"acc_norm_stderr": 0.03623089915724147
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3482142857142857,
"acc_stderr": 0.04521829902833585,
"acc_norm": 0.3482142857142857,
"acc_norm_stderr": 0.04521829902833585
},
"harness|hendrycksTest-management|5": {
"acc": 0.6990291262135923,
"acc_stderr": 0.04541609446503948,
"acc_norm": 0.6990291262135923,
"acc_norm_stderr": 0.04541609446503948
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.811965811965812,
"acc_stderr": 0.025598193686652244,
"acc_norm": 0.811965811965812,
"acc_norm_stderr": 0.025598193686652244
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7484035759897829,
"acc_stderr": 0.015517322365529641,
"acc_norm": 0.7484035759897829,
"acc_norm_stderr": 0.015517322365529641
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.638728323699422,
"acc_stderr": 0.0258622018522779,
"acc_norm": 0.638728323699422,
"acc_norm_stderr": 0.0258622018522779
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4011173184357542,
"acc_stderr": 0.01639222189940707,
"acc_norm": 0.4011173184357542,
"acc_norm_stderr": 0.01639222189940707
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6568627450980392,
"acc_stderr": 0.02718449890994162,
"acc_norm": 0.6568627450980392,
"acc_norm_stderr": 0.02718449890994162
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6495176848874598,
"acc_stderr": 0.02709865262130175,
"acc_norm": 0.6495176848874598,
"acc_norm_stderr": 0.02709865262130175
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6481481481481481,
"acc_stderr": 0.026571483480719964,
"acc_norm": 0.6481481481481481,
"acc_norm_stderr": 0.026571483480719964
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.40425531914893614,
"acc_stderr": 0.029275532159704725,
"acc_norm": 0.40425531914893614,
"acc_norm_stderr": 0.029275532159704725
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.42242503259452413,
"acc_stderr": 0.012615600475734921,
"acc_norm": 0.42242503259452413,
"acc_norm_stderr": 0.012615600475734921
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5735294117647058,
"acc_stderr": 0.030042615832714857,
"acc_norm": 0.5735294117647058,
"acc_norm_stderr": 0.030042615832714857
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5866013071895425,
"acc_stderr": 0.01992211568278669,
"acc_norm": 0.5866013071895425,
"acc_norm_stderr": 0.01992211568278669
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6448979591836734,
"acc_stderr": 0.030635655150387638,
"acc_norm": 0.6448979591836734,
"acc_norm_stderr": 0.030635655150387638
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7562189054726368,
"acc_stderr": 0.030360490154014645,
"acc_norm": 0.7562189054726368,
"acc_norm_stderr": 0.030360490154014645
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4759036144578313,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.4759036144578313,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7368421052631579,
"acc_stderr": 0.03377310252209205,
"acc_norm": 0.7368421052631579,
"acc_norm_stderr": 0.03377310252209205
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3598531211750306,
"mc1_stderr": 0.016801860466677154,
"mc2": 0.4948303207858797,
"mc2_stderr": 0.015361047603025122
}
}
```
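Each per-task score above also has its own configuration holding the detailed per-example outputs. As a sketch (assuming only the `harness_hendrycksTest_<subject>_5` config names declared in the YAML metadata above), the details behind a single MMLU subject can be inspected like this:
```python
from datasets import load_dataset

# Per-example details for one MMLU subject; the config name matches the
# harness_hendrycksTest_abstract_algebra_5 entry declared in the metadata above.
details = load_dataset(
    "open-llm-leaderboard/details_openbmb__UltraLM-13b-v2.0",
    "harness_hendrycksTest_abstract_algebra_5",
    split="latest",
)
print(details[0])  # one evaluated question with the model's scored choices
```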
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
RorooroR/JazzHiphop_64 | 2023-10-09T08:51:06.000Z | [
"region:us"
] | RorooroR | null | null | null | 0 | 0 | Entry not found |
ZeDzZoo/models | 2023-10-09T08:56:06.000Z | [
"region:us"
] | ZeDzZoo | null | null | null | 0 | 0 | Entry not found |
ssfei81/CLIRMatrix | 2023-10-09T10:36:55.000Z | [
"region:us"
] | ssfei81 | null | null | null | 0 | 0 | Entry not found |
bjoernp/mistral_captions | 2023-10-09T09:16:24.000Z | [
"license:apache-2.0",
"region:us"
] | bjoernp | null | null | null | 0 | 0 | ---
license: apache-2.0
configs:
- config_name: default
data_files: captions.jsonl
---
|
TERANTE-MAYITBE/STABLE_DIFFUSION_1.5V_MODEL | 2023-10-10T06:17:47.000Z | [
"license:cc-by-nc-4.0",
"region:us"
] | TERANTE-MAYITBE | null | null | null | 0 | 0 | ---
license: cc-by-nc-4.0
---
|
open-llm-leaderboard/details_Yukang__LongAlpaca-13B | 2023-10-09T09:21:17.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Yukang/LongAlpaca-13B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Yukang/LongAlpaca-13B](https://huggingface.co/Yukang/LongAlpaca-13B) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the most recent results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Yukang__LongAlpaca-13B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-09T09:19:51.890196](https://huggingface.co/datasets/open-llm-leaderboard/details_Yukang__LongAlpaca-13B/blob/main/results_2023-10-09T09-19-51.890196.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.353729370378511,\n\
\ \"acc_stderr\": 0.03430719415150413,\n \"acc_norm\": 0.3566825682662924,\n\
\ \"acc_norm_stderr\": 0.034299138855484475,\n \"mc1\": 0.2460220318237454,\n\
\ \"mc1_stderr\": 0.01507721920066259,\n \"mc2\": 0.36846872335381525,\n\
\ \"mc2_stderr\": 0.015219839654834443\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4232081911262799,\n \"acc_stderr\": 0.014438036220848029,\n\
\ \"acc_norm\": 0.4257679180887372,\n \"acc_norm_stderr\": 0.014449464278868807\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5485958972316272,\n\
\ \"acc_stderr\": 0.004966158142645408,\n \"acc_norm\": 0.7202748456482773,\n\
\ \"acc_norm_stderr\": 0.004479467619464782\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.34074074074074073,\n\
\ \"acc_stderr\": 0.04094376269996793,\n \"acc_norm\": 0.34074074074074073,\n\
\ \"acc_norm_stderr\": 0.04094376269996793\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.26973684210526316,\n \"acc_stderr\": 0.03611780560284898,\n\
\ \"acc_norm\": 0.26973684210526316,\n \"acc_norm_stderr\": 0.03611780560284898\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.42,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.39245283018867927,\n \"acc_stderr\": 0.03005258057955784,\n\
\ \"acc_norm\": 0.39245283018867927,\n \"acc_norm_stderr\": 0.03005258057955784\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2916666666666667,\n\
\ \"acc_stderr\": 0.03800968060554858,\n \"acc_norm\": 0.2916666666666667,\n\
\ \"acc_norm_stderr\": 0.03800968060554858\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2832369942196532,\n\
\ \"acc_stderr\": 0.03435568056047874,\n \"acc_norm\": 0.2832369942196532,\n\
\ \"acc_norm_stderr\": 0.03435568056047874\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.04724007352383887,\n\
\ \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.04724007352383887\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n\
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.31063829787234043,\n \"acc_stderr\": 0.03025123757921317,\n\
\ \"acc_norm\": 0.31063829787234043,\n \"acc_norm_stderr\": 0.03025123757921317\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.19298245614035087,\n\
\ \"acc_stderr\": 0.037124548537213684,\n \"acc_norm\": 0.19298245614035087,\n\
\ \"acc_norm_stderr\": 0.037124548537213684\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.3103448275862069,\n \"acc_stderr\": 0.038552896163789464,\n\
\ \"acc_norm\": 0.3103448275862069,\n \"acc_norm_stderr\": 0.038552896163789464\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2566137566137566,\n \"acc_stderr\": 0.022494510767503154,\n \"\
acc_norm\": 0.2566137566137566,\n \"acc_norm_stderr\": 0.022494510767503154\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.23809523809523808,\n\
\ \"acc_stderr\": 0.03809523809523811,\n \"acc_norm\": 0.23809523809523808,\n\
\ \"acc_norm_stderr\": 0.03809523809523811\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3387096774193548,\n\
\ \"acc_stderr\": 0.026923446059302848,\n \"acc_norm\": 0.3387096774193548,\n\
\ \"acc_norm_stderr\": 0.026923446059302848\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.18226600985221675,\n \"acc_stderr\": 0.02716334085964515,\n\
\ \"acc_norm\": 0.18226600985221675,\n \"acc_norm_stderr\": 0.02716334085964515\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\"\
: 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.4666666666666667,\n \"acc_stderr\": 0.03895658065271846,\n\
\ \"acc_norm\": 0.4666666666666667,\n \"acc_norm_stderr\": 0.03895658065271846\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.3787878787878788,\n \"acc_stderr\": 0.03456088731993747,\n \"\
acc_norm\": 0.3787878787878788,\n \"acc_norm_stderr\": 0.03456088731993747\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.5233160621761658,\n \"acc_stderr\": 0.03604513672442202,\n\
\ \"acc_norm\": 0.5233160621761658,\n \"acc_norm_stderr\": 0.03604513672442202\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.3128205128205128,\n \"acc_stderr\": 0.023507579020645354,\n\
\ \"acc_norm\": 0.3128205128205128,\n \"acc_norm_stderr\": 0.023507579020645354\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24074074074074073,\n \"acc_stderr\": 0.02606715922227579,\n \
\ \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.02606715922227579\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3403361344537815,\n \"acc_stderr\": 0.030778057422931673,\n\
\ \"acc_norm\": 0.3403361344537815,\n \"acc_norm_stderr\": 0.030778057422931673\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3841059602649007,\n \"acc_stderr\": 0.03971301814719197,\n \"\
acc_norm\": 0.3841059602649007,\n \"acc_norm_stderr\": 0.03971301814719197\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.42201834862385323,\n \"acc_stderr\": 0.021174991407763178,\n \"\
acc_norm\": 0.42201834862385323,\n \"acc_norm_stderr\": 0.021174991407763178\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.2962962962962963,\n \"acc_stderr\": 0.031141447823536023,\n \"\
acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.031141447823536023\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.46078431372549017,\n \"acc_stderr\": 0.03498501649369527,\n \"\
acc_norm\": 0.46078431372549017,\n \"acc_norm_stderr\": 0.03498501649369527\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.4767932489451477,\n \"acc_stderr\": 0.03251215201141017,\n \
\ \"acc_norm\": 0.4767932489451477,\n \"acc_norm_stderr\": 0.03251215201141017\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.34977578475336324,\n\
\ \"acc_stderr\": 0.03200736719484503,\n \"acc_norm\": 0.34977578475336324,\n\
\ \"acc_norm_stderr\": 0.03200736719484503\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.366412213740458,\n \"acc_stderr\": 0.04225875451969638,\n\
\ \"acc_norm\": 0.366412213740458,\n \"acc_norm_stderr\": 0.04225875451969638\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.512396694214876,\n \"acc_stderr\": 0.04562951548180765,\n \"acc_norm\"\
: 0.512396694214876,\n \"acc_norm_stderr\": 0.04562951548180765\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.39814814814814814,\n\
\ \"acc_stderr\": 0.04732332615978815,\n \"acc_norm\": 0.39814814814814814,\n\
\ \"acc_norm_stderr\": 0.04732332615978815\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3619631901840491,\n \"acc_stderr\": 0.037757007291414416,\n\
\ \"acc_norm\": 0.3619631901840491,\n \"acc_norm_stderr\": 0.037757007291414416\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.25892857142857145,\n\
\ \"acc_stderr\": 0.041577515398656284,\n \"acc_norm\": 0.25892857142857145,\n\
\ \"acc_norm_stderr\": 0.041577515398656284\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.4563106796116505,\n \"acc_stderr\": 0.04931801994220414,\n\
\ \"acc_norm\": 0.4563106796116505,\n \"acc_norm_stderr\": 0.04931801994220414\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.4700854700854701,\n\
\ \"acc_stderr\": 0.032697411068124425,\n \"acc_norm\": 0.4700854700854701,\n\
\ \"acc_norm_stderr\": 0.032697411068124425\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.4827586206896552,\n\
\ \"acc_stderr\": 0.017869330154003698,\n \"acc_norm\": 0.4827586206896552,\n\
\ \"acc_norm_stderr\": 0.017869330154003698\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.315028901734104,\n \"acc_stderr\": 0.0250093137900697,\n\
\ \"acc_norm\": 0.315028901734104,\n \"acc_norm_stderr\": 0.0250093137900697\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.34967320261437906,\n \"acc_stderr\": 0.027305308076274702,\n\
\ \"acc_norm\": 0.34967320261437906,\n \"acc_norm_stderr\": 0.027305308076274702\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.34726688102893893,\n\
\ \"acc_stderr\": 0.027040745502307336,\n \"acc_norm\": 0.34726688102893893,\n\
\ \"acc_norm_stderr\": 0.027040745502307336\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.35802469135802467,\n \"acc_stderr\": 0.026675611926037093,\n\
\ \"acc_norm\": 0.35802469135802467,\n \"acc_norm_stderr\": 0.026675611926037093\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.30851063829787234,\n \"acc_stderr\": 0.02755336616510136,\n \
\ \"acc_norm\": 0.30851063829787234,\n \"acc_norm_stderr\": 0.02755336616510136\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3089960886571056,\n\
\ \"acc_stderr\": 0.011801729777239246,\n \"acc_norm\": 0.3089960886571056,\n\
\ \"acc_norm_stderr\": 0.011801729777239246\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.40808823529411764,\n \"acc_stderr\": 0.029855261393483924,\n\
\ \"acc_norm\": 0.40808823529411764,\n \"acc_norm_stderr\": 0.029855261393483924\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.30718954248366015,\n \"acc_stderr\": 0.018663359671463677,\n \
\ \"acc_norm\": 0.30718954248366015,\n \"acc_norm_stderr\": 0.018663359671463677\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.32727272727272727,\n\
\ \"acc_stderr\": 0.04494290866252088,\n \"acc_norm\": 0.32727272727272727,\n\
\ \"acc_norm_stderr\": 0.04494290866252088\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.4326530612244898,\n \"acc_stderr\": 0.03171752824062664,\n\
\ \"acc_norm\": 0.4326530612244898,\n \"acc_norm_stderr\": 0.03171752824062664\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.39800995024875624,\n\
\ \"acc_stderr\": 0.03461199429040013,\n \"acc_norm\": 0.39800995024875624,\n\
\ \"acc_norm_stderr\": 0.03461199429040013\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.37349397590361444,\n\
\ \"acc_stderr\": 0.037658451171688624,\n \"acc_norm\": 0.37349397590361444,\n\
\ \"acc_norm_stderr\": 0.037658451171688624\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.38596491228070173,\n \"acc_stderr\": 0.03733756969066164,\n\
\ \"acc_norm\": 0.38596491228070173,\n \"acc_norm_stderr\": 0.03733756969066164\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2460220318237454,\n\
\ \"mc1_stderr\": 0.01507721920066259,\n \"mc2\": 0.36846872335381525,\n\
\ \"mc2_stderr\": 0.015219839654834443\n }\n}\n```"
repo_url: https://huggingface.co/Yukang/LongAlpaca-13B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|arc:challenge|25_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hellaswag|10_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T09-19-51.890196.parquet'
- config_name: results
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- results_2023-10-09T09-19-51.890196.parquet
- split: latest
path:
- results_2023-10-09T09-19-51.890196.parquet
---
# Dataset Card for Evaluation run of Yukang/LongAlpaca-13B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Yukang/LongAlpaca-13B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of the model [Yukang/LongAlpaca-13B](https://huggingface.co/Yukang/LongAlpaca-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Yukang__LongAlpaca-13B",
"harness_truthfulqa_mc_0",
split="train")
```
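The aggregated metrics live in the `results` configuration declared above; a quick sketch of pulling the most recent run via its `latest` split:
```python
from datasets import load_dataset

# "results" holds the aggregated metrics for the run; the "latest"
# split always resolves to the most recent timestamped results file.
results = load_dataset("open-llm-leaderboard/details_Yukang__LongAlpaca-13B",
                       "results",
                       split="latest")
```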
## Latest results
These are the [latest results from run 2023-10-09T09:19:51.890196](https://huggingface.co/datasets/open-llm-leaderboard/details_Yukang__LongAlpaca-13B/blob/main/results_2023-10-09T09-19-51.890196.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.353729370378511,
"acc_stderr": 0.03430719415150413,
"acc_norm": 0.3566825682662924,
"acc_norm_stderr": 0.034299138855484475,
"mc1": 0.2460220318237454,
"mc1_stderr": 0.01507721920066259,
"mc2": 0.36846872335381525,
"mc2_stderr": 0.015219839654834443
},
"harness|arc:challenge|25": {
"acc": 0.4232081911262799,
"acc_stderr": 0.014438036220848029,
"acc_norm": 0.4257679180887372,
"acc_norm_stderr": 0.014449464278868807
},
"harness|hellaswag|10": {
"acc": 0.5485958972316272,
"acc_stderr": 0.004966158142645408,
"acc_norm": 0.7202748456482773,
"acc_norm_stderr": 0.004479467619464782
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.04094376269996793,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.04094376269996793
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.26973684210526316,
"acc_stderr": 0.03611780560284898,
"acc_norm": 0.26973684210526316,
"acc_norm_stderr": 0.03611780560284898
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.39245283018867927,
"acc_stderr": 0.03005258057955784,
"acc_norm": 0.39245283018867927,
"acc_norm_stderr": 0.03005258057955784
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2916666666666667,
"acc_stderr": 0.03800968060554858,
"acc_norm": 0.2916666666666667,
"acc_norm_stderr": 0.03800968060554858
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2832369942196532,
"acc_stderr": 0.03435568056047874,
"acc_norm": 0.2832369942196532,
"acc_norm_stderr": 0.03435568056047874
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.04724007352383887,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.04724007352383887
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.31063829787234043,
"acc_stderr": 0.03025123757921317,
"acc_norm": 0.31063829787234043,
"acc_norm_stderr": 0.03025123757921317
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.19298245614035087,
"acc_stderr": 0.037124548537213684,
"acc_norm": 0.19298245614035087,
"acc_norm_stderr": 0.037124548537213684
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.3103448275862069,
"acc_stderr": 0.038552896163789464,
"acc_norm": 0.3103448275862069,
"acc_norm_stderr": 0.038552896163789464
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2566137566137566,
"acc_stderr": 0.022494510767503154,
"acc_norm": 0.2566137566137566,
"acc_norm_stderr": 0.022494510767503154
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.23809523809523808,
"acc_stderr": 0.03809523809523811,
"acc_norm": 0.23809523809523808,
"acc_norm_stderr": 0.03809523809523811
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3387096774193548,
"acc_stderr": 0.026923446059302848,
"acc_norm": 0.3387096774193548,
"acc_norm_stderr": 0.026923446059302848
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.18226600985221675,
"acc_stderr": 0.02716334085964515,
"acc_norm": 0.18226600985221675,
"acc_norm_stderr": 0.02716334085964515
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.4666666666666667,
"acc_stderr": 0.03895658065271846,
"acc_norm": 0.4666666666666667,
"acc_norm_stderr": 0.03895658065271846
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.3787878787878788,
"acc_stderr": 0.03456088731993747,
"acc_norm": 0.3787878787878788,
"acc_norm_stderr": 0.03456088731993747
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.5233160621761658,
"acc_stderr": 0.03604513672442202,
"acc_norm": 0.5233160621761658,
"acc_norm_stderr": 0.03604513672442202
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3128205128205128,
"acc_stderr": 0.023507579020645354,
"acc_norm": 0.3128205128205128,
"acc_norm_stderr": 0.023507579020645354
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.02606715922227579,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.02606715922227579
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3403361344537815,
"acc_stderr": 0.030778057422931673,
"acc_norm": 0.3403361344537815,
"acc_norm_stderr": 0.030778057422931673
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3841059602649007,
"acc_stderr": 0.03971301814719197,
"acc_norm": 0.3841059602649007,
"acc_norm_stderr": 0.03971301814719197
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.42201834862385323,
"acc_stderr": 0.021174991407763178,
"acc_norm": 0.42201834862385323,
"acc_norm_stderr": 0.021174991407763178
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.031141447823536023,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.031141447823536023
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.46078431372549017,
"acc_stderr": 0.03498501649369527,
"acc_norm": 0.46078431372549017,
"acc_norm_stderr": 0.03498501649369527
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.4767932489451477,
"acc_stderr": 0.03251215201141017,
"acc_norm": 0.4767932489451477,
"acc_norm_stderr": 0.03251215201141017
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.34977578475336324,
"acc_stderr": 0.03200736719484503,
"acc_norm": 0.34977578475336324,
"acc_norm_stderr": 0.03200736719484503
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.366412213740458,
"acc_stderr": 0.04225875451969638,
"acc_norm": 0.366412213740458,
"acc_norm_stderr": 0.04225875451969638
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.512396694214876,
"acc_stderr": 0.04562951548180765,
"acc_norm": 0.512396694214876,
"acc_norm_stderr": 0.04562951548180765
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.39814814814814814,
"acc_stderr": 0.04732332615978815,
"acc_norm": 0.39814814814814814,
"acc_norm_stderr": 0.04732332615978815
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3619631901840491,
"acc_stderr": 0.037757007291414416,
"acc_norm": 0.3619631901840491,
"acc_norm_stderr": 0.037757007291414416
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.25892857142857145,
"acc_stderr": 0.041577515398656284,
"acc_norm": 0.25892857142857145,
"acc_norm_stderr": 0.041577515398656284
},
"harness|hendrycksTest-management|5": {
"acc": 0.4563106796116505,
"acc_stderr": 0.04931801994220414,
"acc_norm": 0.4563106796116505,
"acc_norm_stderr": 0.04931801994220414
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.4700854700854701,
"acc_stderr": 0.032697411068124425,
"acc_norm": 0.4700854700854701,
"acc_norm_stderr": 0.032697411068124425
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.017869330154003698,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.017869330154003698
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.315028901734104,
"acc_stderr": 0.0250093137900697,
"acc_norm": 0.315028901734104,
"acc_norm_stderr": 0.0250093137900697
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.34967320261437906,
"acc_stderr": 0.027305308076274702,
"acc_norm": 0.34967320261437906,
"acc_norm_stderr": 0.027305308076274702
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.34726688102893893,
"acc_stderr": 0.027040745502307336,
"acc_norm": 0.34726688102893893,
"acc_norm_stderr": 0.027040745502307336
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.35802469135802467,
"acc_stderr": 0.026675611926037093,
"acc_norm": 0.35802469135802467,
"acc_norm_stderr": 0.026675611926037093
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.30851063829787234,
"acc_stderr": 0.02755336616510136,
"acc_norm": 0.30851063829787234,
"acc_norm_stderr": 0.02755336616510136
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3089960886571056,
"acc_stderr": 0.011801729777239246,
"acc_norm": 0.3089960886571056,
"acc_norm_stderr": 0.011801729777239246
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.40808823529411764,
"acc_stderr": 0.029855261393483924,
"acc_norm": 0.40808823529411764,
"acc_norm_stderr": 0.029855261393483924
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.30718954248366015,
"acc_stderr": 0.018663359671463677,
"acc_norm": 0.30718954248366015,
"acc_norm_stderr": 0.018663359671463677
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.32727272727272727,
"acc_stderr": 0.04494290866252088,
"acc_norm": 0.32727272727272727,
"acc_norm_stderr": 0.04494290866252088
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4326530612244898,
"acc_stderr": 0.03171752824062664,
"acc_norm": 0.4326530612244898,
"acc_norm_stderr": 0.03171752824062664
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.39800995024875624,
"acc_stderr": 0.03461199429040013,
"acc_norm": 0.39800995024875624,
"acc_norm_stderr": 0.03461199429040013
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-virology|5": {
"acc": 0.37349397590361444,
"acc_stderr": 0.037658451171688624,
"acc_norm": 0.37349397590361444,
"acc_norm_stderr": 0.037658451171688624
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.38596491228070173,
"acc_stderr": 0.03733756969066164,
"acc_norm": 0.38596491228070173,
"acc_norm_stderr": 0.03733756969066164
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2460220318237454,
"mc1_stderr": 0.01507721920066259,
"mc2": 0.36846872335381525,
"mc2_stderr": 0.015219839654834443
}
}
```
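For working with these numbers outside of `datasets`, here is a minimal sketch (assuming the `huggingface_hub` client is installed) that downloads the raw JSON linked above and reads one aggregate metric:
```python
import json

from huggingface_hub import hf_hub_download

# Fetch the raw results file for this run from the dataset repository.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_Yukang__LongAlpaca-13B",
    filename="results_2023-10-09T09-19-51.890196.json",
    repo_type="dataset",
)

with open(path) as f:
    results = json.load(f)

# Aggregate accuracy across all tasks, as reported under the "all" key.
print(results["all"]["acc"])  # 0.353729370378511
```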
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_IkariDev__Athena-v4 | 2023-10-09T09:31:08.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of IkariDev/Athena-v4
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [IkariDev/Athena-v4](https://huggingface.co/IkariDev/Athena-v4) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_IkariDev__Athena-v4\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-09T09:29:40.768179](https://huggingface.co/datasets/open-llm-leaderboard/details_IkariDev__Athena-v4/blob/main/results_2023-10-09T09-29-40.768179.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5748707164895506,\n\
\ \"acc_stderr\": 0.03415536790442921,\n \"acc_norm\": 0.578705233373212,\n\
\ \"acc_norm_stderr\": 0.034132574147301115,\n \"mc1\": 0.3623011015911873,\n\
\ \"mc1_stderr\": 0.016826646897262255,\n \"mc2\": 0.5086973486972686,\n\
\ \"mc2_stderr\": 0.015551447041219538\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5895904436860068,\n \"acc_stderr\": 0.014374922192642664,\n\
\ \"acc_norm\": 0.6254266211604096,\n \"acc_norm_stderr\": 0.01414419347189345\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6514638518223461,\n\
\ \"acc_stderr\": 0.00475532924397667,\n \"acc_norm\": 0.8418641704839673,\n\
\ \"acc_norm_stderr\": 0.003641226294167795\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4888888888888889,\n\
\ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.4888888888888889,\n\
\ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5657894736842105,\n \"acc_stderr\": 0.0403356566784832,\n\
\ \"acc_norm\": 0.5657894736842105,\n \"acc_norm_stderr\": 0.0403356566784832\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6075471698113207,\n \"acc_stderr\": 0.03005258057955784,\n\
\ \"acc_norm\": 0.6075471698113207,\n \"acc_norm_stderr\": 0.03005258057955784\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6319444444444444,\n\
\ \"acc_stderr\": 0.04032999053960718,\n \"acc_norm\": 0.6319444444444444,\n\
\ \"acc_norm_stderr\": 0.04032999053960718\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.47,\n\
\ \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5606936416184971,\n\
\ \"acc_stderr\": 0.037842719328874674,\n \"acc_norm\": 0.5606936416184971,\n\
\ \"acc_norm_stderr\": 0.037842719328874674\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.043898699568087764,\n\
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.043898699568087764\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.71,\n \"acc_stderr\": 0.04560480215720685,\n \"acc_norm\": 0.71,\n\
\ \"acc_norm_stderr\": 0.04560480215720685\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4553191489361702,\n \"acc_stderr\": 0.03255525359340355,\n\
\ \"acc_norm\": 0.4553191489361702,\n \"acc_norm_stderr\": 0.03255525359340355\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n\
\ \"acc_stderr\": 0.0433913832257986,\n \"acc_norm\": 0.30701754385964913,\n\
\ \"acc_norm_stderr\": 0.0433913832257986\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4896551724137931,\n \"acc_stderr\": 0.04165774775728763,\n\
\ \"acc_norm\": 0.4896551724137931,\n \"acc_norm_stderr\": 0.04165774775728763\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3412698412698413,\n \"acc_stderr\": 0.024419234966819064,\n \"\
acc_norm\": 0.3412698412698413,\n \"acc_norm_stderr\": 0.024419234966819064\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n\
\ \"acc_stderr\": 0.04325506042017086,\n \"acc_norm\": 0.373015873015873,\n\
\ \"acc_norm_stderr\": 0.04325506042017086\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6548387096774193,\n\
\ \"acc_stderr\": 0.02704574657353433,\n \"acc_norm\": 0.6548387096774193,\n\
\ \"acc_norm_stderr\": 0.02704574657353433\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4482758620689655,\n \"acc_stderr\": 0.034991131376767445,\n\
\ \"acc_norm\": 0.4482758620689655,\n \"acc_norm_stderr\": 0.034991131376767445\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\"\
: 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.703030303030303,\n \"acc_stderr\": 0.035679697722680495,\n\
\ \"acc_norm\": 0.703030303030303,\n \"acc_norm_stderr\": 0.035679697722680495\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7121212121212122,\n \"acc_stderr\": 0.03225883512300992,\n \"\
acc_norm\": 0.7121212121212122,\n \"acc_norm_stderr\": 0.03225883512300992\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8238341968911918,\n \"acc_stderr\": 0.027493504244548064,\n\
\ \"acc_norm\": 0.8238341968911918,\n \"acc_norm_stderr\": 0.027493504244548064\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5435897435897435,\n \"acc_stderr\": 0.025254485424799605,\n\
\ \"acc_norm\": 0.5435897435897435,\n \"acc_norm_stderr\": 0.025254485424799605\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948496,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948496\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6008403361344538,\n \"acc_stderr\": 0.03181110032413926,\n \
\ \"acc_norm\": 0.6008403361344538,\n \"acc_norm_stderr\": 0.03181110032413926\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658753,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658753\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7541284403669725,\n \"acc_stderr\": 0.018461940968708433,\n \"\
acc_norm\": 0.7541284403669725,\n \"acc_norm_stderr\": 0.018461940968708433\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.42592592592592593,\n \"acc_stderr\": 0.03372343271653064,\n \"\
acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.03372343271653064\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7941176470588235,\n \"acc_stderr\": 0.028379449451588663,\n \"\
acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.028379449451588663\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7848101265822784,\n \"acc_stderr\": 0.026750826994676177,\n \
\ \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.026750826994676177\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.03114679648297246,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.03114679648297246\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6183206106870229,\n \"acc_stderr\": 0.042607351576445594,\n\
\ \"acc_norm\": 0.6183206106870229,\n \"acc_norm_stderr\": 0.042607351576445594\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.743801652892562,\n \"acc_stderr\": 0.039849796533028725,\n \"\
acc_norm\": 0.743801652892562,\n \"acc_norm_stderr\": 0.039849796533028725\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6993865030674846,\n \"acc_stderr\": 0.03602511318806771,\n\
\ \"acc_norm\": 0.6993865030674846,\n \"acc_norm_stderr\": 0.03602511318806771\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.375,\n\
\ \"acc_stderr\": 0.04595091388086298,\n \"acc_norm\": 0.375,\n \
\ \"acc_norm_stderr\": 0.04595091388086298\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6990291262135923,\n \"acc_stderr\": 0.045416094465039476,\n\
\ \"acc_norm\": 0.6990291262135923,\n \"acc_norm_stderr\": 0.045416094465039476\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7991452991452992,\n\
\ \"acc_stderr\": 0.02624677294689048,\n \"acc_norm\": 0.7991452991452992,\n\
\ \"acc_norm_stderr\": 0.02624677294689048\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7803320561941252,\n\
\ \"acc_stderr\": 0.014805384478371155,\n \"acc_norm\": 0.7803320561941252,\n\
\ \"acc_norm_stderr\": 0.014805384478371155\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6416184971098265,\n \"acc_stderr\": 0.025816756791584187,\n\
\ \"acc_norm\": 0.6416184971098265,\n \"acc_norm_stderr\": 0.025816756791584187\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.488268156424581,\n\
\ \"acc_stderr\": 0.01671789767693216,\n \"acc_norm\": 0.488268156424581,\n\
\ \"acc_norm_stderr\": 0.01671789767693216\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6405228758169934,\n \"acc_stderr\": 0.027475969910660952,\n\
\ \"acc_norm\": 0.6405228758169934,\n \"acc_norm_stderr\": 0.027475969910660952\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6591639871382636,\n\
\ \"acc_stderr\": 0.026920841260776165,\n \"acc_norm\": 0.6591639871382636,\n\
\ \"acc_norm_stderr\": 0.026920841260776165\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6512345679012346,\n \"acc_stderr\": 0.026517597724465013,\n\
\ \"acc_norm\": 0.6512345679012346,\n \"acc_norm_stderr\": 0.026517597724465013\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4326241134751773,\n \"acc_stderr\": 0.02955545423677886,\n \
\ \"acc_norm\": 0.4326241134751773,\n \"acc_norm_stderr\": 0.02955545423677886\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.42959582790091266,\n\
\ \"acc_stderr\": 0.012643004623790206,\n \"acc_norm\": 0.42959582790091266,\n\
\ \"acc_norm_stderr\": 0.012643004623790206\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5625,\n \"acc_stderr\": 0.030134614954403924,\n \
\ \"acc_norm\": 0.5625,\n \"acc_norm_stderr\": 0.030134614954403924\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5915032679738562,\n \"acc_stderr\": 0.019886221037501865,\n \
\ \"acc_norm\": 0.5915032679738562,\n \"acc_norm_stderr\": 0.019886221037501865\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6489795918367347,\n \"acc_stderr\": 0.03055531675557364,\n\
\ \"acc_norm\": 0.6489795918367347,\n \"acc_norm_stderr\": 0.03055531675557364\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.736318407960199,\n\
\ \"acc_stderr\": 0.031157150869355558,\n \"acc_norm\": 0.736318407960199,\n\
\ \"acc_norm_stderr\": 0.031157150869355558\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4819277108433735,\n\
\ \"acc_stderr\": 0.038899512528272166,\n \"acc_norm\": 0.4819277108433735,\n\
\ \"acc_norm_stderr\": 0.038899512528272166\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.031885780176863984,\n\
\ \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.031885780176863984\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3623011015911873,\n\
\ \"mc1_stderr\": 0.016826646897262255,\n \"mc2\": 0.5086973486972686,\n\
\ \"mc2_stderr\": 0.015551447041219538\n }\n}\n```"
repo_url: https://huggingface.co/IkariDev/Athena-v4
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|arc:challenge|25_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hellaswag|10_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T09-29-40.768179.parquet'
- config_name: results
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- results_2023-10-09T09-29-40.768179.parquet
- split: latest
path:
- results_2023-10-09T09-29-40.768179.parquet
---
# Dataset Card for Evaluation run of IkariDev/Athena-v4
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/IkariDev/Athena-v4
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [IkariDev/Athena-v4](https://huggingface.co/IkariDev/Athena-v4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_IkariDev__Athena-v4",
"harness_truthfulqa_mc_0",
split="train")
```
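The aggregated metrics live in the `results` configuration defined in this card's YAML and can be loaded the same way (the `latest` split name comes from the configuration above):
```python
from datasets import load_dataset
results = load_dataset("open-llm-leaderboard/details_IkariDev__Athena-v4",
	"results",
	split="latest")
```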
## Latest results
These are the [latest results from run 2023-10-09T09:29:40.768179](https://huggingface.co/datasets/open-llm-leaderboard/details_IkariDev__Athena-v4/blob/main/results_2023-10-09T09-29-40.768179.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5748707164895506,
"acc_stderr": 0.03415536790442921,
"acc_norm": 0.578705233373212,
"acc_norm_stderr": 0.034132574147301115,
"mc1": 0.3623011015911873,
"mc1_stderr": 0.016826646897262255,
"mc2": 0.5086973486972686,
"mc2_stderr": 0.015551447041219538
},
"harness|arc:challenge|25": {
"acc": 0.5895904436860068,
"acc_stderr": 0.014374922192642664,
"acc_norm": 0.6254266211604096,
"acc_norm_stderr": 0.01414419347189345
},
"harness|hellaswag|10": {
"acc": 0.6514638518223461,
"acc_stderr": 0.00475532924397667,
"acc_norm": 0.8418641704839673,
"acc_norm_stderr": 0.003641226294167795
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4888888888888889,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.4888888888888889,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5657894736842105,
"acc_stderr": 0.0403356566784832,
"acc_norm": 0.5657894736842105,
"acc_norm_stderr": 0.0403356566784832
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6075471698113207,
"acc_stderr": 0.03005258057955784,
"acc_norm": 0.6075471698113207,
"acc_norm_stderr": 0.03005258057955784
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6319444444444444,
"acc_stderr": 0.04032999053960718,
"acc_norm": 0.6319444444444444,
"acc_norm_stderr": 0.04032999053960718
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5606936416184971,
"acc_stderr": 0.037842719328874674,
"acc_norm": 0.5606936416184971,
"acc_norm_stderr": 0.037842719328874674
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.043898699568087764,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.043898699568087764
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720685,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720685
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4553191489361702,
"acc_stderr": 0.03255525359340355,
"acc_norm": 0.4553191489361702,
"acc_norm_stderr": 0.03255525359340355
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.30701754385964913,
"acc_stderr": 0.0433913832257986,
"acc_norm": 0.30701754385964913,
"acc_norm_stderr": 0.0433913832257986
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4896551724137931,
"acc_stderr": 0.04165774775728763,
"acc_norm": 0.4896551724137931,
"acc_norm_stderr": 0.04165774775728763
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3412698412698413,
"acc_stderr": 0.024419234966819064,
"acc_norm": 0.3412698412698413,
"acc_norm_stderr": 0.024419234966819064
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.04325506042017086,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.04325506042017086
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6548387096774193,
"acc_stderr": 0.02704574657353433,
"acc_norm": 0.6548387096774193,
"acc_norm_stderr": 0.02704574657353433
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4482758620689655,
"acc_stderr": 0.034991131376767445,
"acc_norm": 0.4482758620689655,
"acc_norm_stderr": 0.034991131376767445
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.703030303030303,
"acc_stderr": 0.035679697722680495,
"acc_norm": 0.703030303030303,
"acc_norm_stderr": 0.035679697722680495
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7121212121212122,
"acc_stderr": 0.03225883512300992,
"acc_norm": 0.7121212121212122,
"acc_norm_stderr": 0.03225883512300992
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8238341968911918,
"acc_stderr": 0.027493504244548064,
"acc_norm": 0.8238341968911918,
"acc_norm_stderr": 0.027493504244548064
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5435897435897435,
"acc_stderr": 0.025254485424799605,
"acc_norm": 0.5435897435897435,
"acc_norm_stderr": 0.025254485424799605
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.028742040903948496,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.028742040903948496
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6008403361344538,
"acc_stderr": 0.03181110032413926,
"acc_norm": 0.6008403361344538,
"acc_norm_stderr": 0.03181110032413926
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658753,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658753
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7541284403669725,
"acc_stderr": 0.018461940968708433,
"acc_norm": 0.7541284403669725,
"acc_norm_stderr": 0.018461940968708433
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.03372343271653064,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.03372343271653064
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.028379449451588663,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.028379449451588663
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7848101265822784,
"acc_stderr": 0.026750826994676177,
"acc_norm": 0.7848101265822784,
"acc_norm_stderr": 0.026750826994676177
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.03114679648297246,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.03114679648297246
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6183206106870229,
"acc_stderr": 0.042607351576445594,
"acc_norm": 0.6183206106870229,
"acc_norm_stderr": 0.042607351576445594
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.039849796533028725,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.039849796533028725
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6993865030674846,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.6993865030674846,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.375,
"acc_stderr": 0.04595091388086298,
"acc_norm": 0.375,
"acc_norm_stderr": 0.04595091388086298
},
"harness|hendrycksTest-management|5": {
"acc": 0.6990291262135923,
"acc_stderr": 0.045416094465039476,
"acc_norm": 0.6990291262135923,
"acc_norm_stderr": 0.045416094465039476
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7991452991452992,
"acc_stderr": 0.02624677294689048,
"acc_norm": 0.7991452991452992,
"acc_norm_stderr": 0.02624677294689048
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7803320561941252,
"acc_stderr": 0.014805384478371155,
"acc_norm": 0.7803320561941252,
"acc_norm_stderr": 0.014805384478371155
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.025816756791584187,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.025816756791584187
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.488268156424581,
"acc_stderr": 0.01671789767693216,
"acc_norm": 0.488268156424581,
"acc_norm_stderr": 0.01671789767693216
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6405228758169934,
"acc_stderr": 0.027475969910660952,
"acc_norm": 0.6405228758169934,
"acc_norm_stderr": 0.027475969910660952
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6591639871382636,
"acc_stderr": 0.026920841260776165,
"acc_norm": 0.6591639871382636,
"acc_norm_stderr": 0.026920841260776165
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6512345679012346,
"acc_stderr": 0.026517597724465013,
"acc_norm": 0.6512345679012346,
"acc_norm_stderr": 0.026517597724465013
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4326241134751773,
"acc_stderr": 0.02955545423677886,
"acc_norm": 0.4326241134751773,
"acc_norm_stderr": 0.02955545423677886
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.42959582790091266,
"acc_stderr": 0.012643004623790206,
"acc_norm": 0.42959582790091266,
"acc_norm_stderr": 0.012643004623790206
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5625,
"acc_stderr": 0.030134614954403924,
"acc_norm": 0.5625,
"acc_norm_stderr": 0.030134614954403924
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5915032679738562,
"acc_stderr": 0.019886221037501865,
"acc_norm": 0.5915032679738562,
"acc_norm_stderr": 0.019886221037501865
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6489795918367347,
"acc_stderr": 0.03055531675557364,
"acc_norm": 0.6489795918367347,
"acc_norm_stderr": 0.03055531675557364
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.736318407960199,
"acc_stderr": 0.031157150869355558,
"acc_norm": 0.736318407960199,
"acc_norm_stderr": 0.031157150869355558
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4819277108433735,
"acc_stderr": 0.038899512528272166,
"acc_norm": 0.4819277108433735,
"acc_norm_stderr": 0.038899512528272166
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.031885780176863984,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.031885780176863984
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3623011015911873,
"mc1_stderr": 0.016826646897262255,
"mc2": 0.5086973486972686,
"mc2_stderr": 0.015551447041219538
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Riksarkivet/Trolldomskommissionen_Text_Recognition | 2023-10-09T09:59:17.000Z | [
"region:us"
] | Riksarkivet | null | null | null | 0 | 0 | Entry not found |
cambridgeltl/posqa | 2023-10-09T09:40:22.000Z | [
"license:apache-2.0",
"region:us"
] | cambridgeltl | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
yw26342/test | 2023-10-09T13:28:37.000Z | [
"task_categories:image-classification",
"size_categories:100K<n<1M",
"language:en",
"license:unknown",
"region:us"
] | yw26342 | null | null | null | 0 | 0 | ---
license: unknown
task_categories:
- image-classification
language:
- en
pretty_name: Food-500
size_categories:
- 100K<n<1M
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': Abalone
'1': Aburaage
'2': Æbleflæsk
'3': Agedashi_tofu
'4': Aligot
'5': Almond_biscuit
'6': Aloo_gobi
'7': Ambrosia_(fruit_salad)
'8': Anago
'9': Andong_jjimdak
'10': Ants_climbing_a_tree
'11': Anzac_biscuit
'12': Apple_cobbler
'13': Apple_crisp
'14': Apple_dumpling
'15': Apple_pie
'16': Apple_sauce
'17': Apple_strudel
'18': Applesauce_cake
'19': Arab_salad
'20': Arancini
'21': Arròs_negre
'22': Asam_pedas
'23': Avocado_bread
'24': Avocado_fries
'25': Avocado_key_lime_pie
'26': Avocado_sauce
'27': Avocado_toast
'28': Ayam_bakar
'29': Ayam_betutu
'30': Ayam_goreng
'31': Babi_panggang
'32': Bacon_and_eggs
'33': Bacon_egg_and_cheese_sandwich
'34': Bacon_explosion
'35': Bacon_sandwich
'36': Baingan_bharta
'37': Baked_beans
'38': Banana_bread
'39': Banana_pancakes
'40': Banana_pudding
'41': Banana_salad
'42': Banana_split
'43': Bandeja_paisa
'44': Bangers_and_mash
'45': Banh_bo
'46': Bánh_canh
'47': Bánh_chuối
'48': Bean_pie
'49': Bean_salad
'50': Beef_bun
'51': Beef_chow_fun
'52': Beef_noodle_soup
'53': Beef_Wellington
'54': Beignet
'55': Bibimbap
'56': Bibingka
'57': Biryani
'58': Bisque_food
'59': Bistek
'60': Black_Forest_gateau
'61': Black_pepper_crab
'62': Blackberry_pie
'63': Blancmange
'64': BLT_sandwich
'65': Blueberry_pie
'66': Boiled_beef
'67': Boiled_peanuts
'68': Bon_bon_chicken
'69': Borscht
'70': Botan-ebi
'71': Botifarra
'72': Bouillabaisse
'73': Bourbon_chicken
'74': Bread_roll
'75': Broccoli_slaw
'76': Brown_stew_chicken
'77': Bruschetta
'78': Bubur_ayam
'79': Bubur_kacang_hijau
'80': Bugnes
'81': Bun_bo_hue
'82': Bún_ốc
'83': Bún_riêu
'84': Burrito
'85': Butadon
'86': Butter_cake
'87': Butter_pecan
'88': Butter_pie
'89': Buttercream
'90': Buttery_bread
'91': Cabbage_soup
'92': Caesar_salad
'93': Café_liégeois
'94': Calamari
'95': Caldo_verde
'96': Caprese_salad
'97': Carpaccio
'98': Carrot_bread
'99': Carrot_cake
'100': Carrot_cake_cookie
'101': Carrot_chips
'102': Carrot_juice
'103': Carrot_salad
'104': Carrot_soup
'105': Cassoulet
'106': Cebolada
'107': Ceviche
'108': Chakli
'109': Champurrado
'110': Char_kway_teow
'111': Charqui
'112': Chazuke
'113': Cheese_and_tomato_sandwich
'114': Cheese_bun
'115': Cheese_fries
'116': Cheese_maki
'117': Cheese_pudding
'118': Cheese_roll
'119': Cheesecake
'120': Cherry_cake
'121': Cherry_ice_cream
'122': Cherry_pie
'123': Chicken_and_waffles
'124': Chicken_cacciatore
'125': Chicken_curry
'126': Chicken_fingers
'127': Chicken_Kiev
'128': Chicken_marsala
'129': Chicken_noodle_soup
'130': Chicken_nugget
'131': Chicken_parmigiana
'132': Chicken_salad
'133': Chicken_soup
'134': Chiles_en_nogada
'135': Chili_con_carne
'136': Chilli_crab
'137': Chinese_chicken_salad
'138': Chocolate_covered_strawberries
'139': Chocolate_marshmallow_pie
'140': Chow_mein
'141': Chowder
'142': Christmas_cake
'143': Churro
'144': Ciambelle
'145': Çiğ_köfte
'146': Clam_chowder
'147': Claypot_chicken_rice
'148': Club_sandwich
'149': Cobbler_food
'150': Coconut_cake
'151': Coconut_cream
'152': Coconut_shrimp
'153': Coda_alla_vaccinara
'154': Coffee_jelly
'155': Colcannon
'156': Coleslaw
'157': Congee
'158': Corn_chip
'159': Corn_crab_soup
'160': Corn_dog
'161': Corn_soup
'162': Corn_stew
'163': Corn_tortilla
'164': Corned_beef_sandwich
'165': Cotoletta
'166': Crab_cake
'167': Crab_dip
'168': Crab_in_Padang_sauce
'169': Crab_puff
'170': Crab_rangoon
'171': Cream_of_broccoli_soup
'172': Cream_of_mushroom_soup
'173': Cream_pie
'174': Creamed_eggs_on_toast
'175': Croissant
'176': Crudo
'177': Crumble
'178': Cucumber_soup
'179': Cured_salmon
'180': Curry_goat
'181': Curry_Mee
'182': Curry_rice
'183': Currywurst
'184': Dandan_noodles
'185': Deep-fried_peanuts
'186': Denver_omelette
'187': Denver_sandwich
'188': Dhokla
'189': Di_san_xian
'190': Dinuguan
'191': Dirty_Rice
'192': Dong'an_chicken
'193': Doufunao
'194': Doughnut
'195': Drunken_chicken
'196': Drunken_shrimp
'197': Duck_soup_noodles
'198': Edamame
'199': Eggplant_omelette
'200': Eisbein
'201': Es_campur
'202': Es_pisang_ijo
'203': Filet_mignon
'204': Fish_ball
'205': Fish_chowder
'206': Fish_head_casserole
'207': Fish_head_curry
'208': Fish_soup
'209': Fish_steak
'210': Fishcake
'211': Fondue
'212': Francesinha
'213': Frappe
'214': French_onion_soup
'215': French_toast
'216': Fricassee
'217': Fried_aubergine
'218': Fried_cheese
'219': Fried_chicken
'220': Fried_noodles
'221': Fried_onion
'222': Fried_rice
'223': Fried_Tofu
'224': Fruit_curd
'225': Fuqi_feipian
'226': Galbi
'227': Garlic_bread
'228': Gejang
'229': Ginger_beef
'230': Gołąbki
'231': Goulash
'232': Gravlax
'233': Greek_salad
'234': Green_bean_casserole
'235': Green_papaya_salad
'236': Grits
'237': Gulai_Ayam
'238': Gyutan
'239': Hae_mee
'240': Hainanese_chicken_rice
'241': Ham_and_cheese_sandwich
'242': Ham_salad
'243': Ham_sandwich
'244': hamburgers
'245': Har_gow
'246': Hiyayakko
'247': Hokkien_mee
'248': Hongshao_rou
'249': Hoppin'_John
'250': Hot_and_sour_soup
'251': Huangqiao_sesame_cake
'252': Huevos_rancheros
'253': Hummus
'254': Ikan_bakar
'255': Ikan_goreng
'256': Israeli_salad
'257': Italian_tomato_pie
'258': Jajangmyeon
'259': Jambalaya
'260': Jambonneau
'261': Japchae
'262': Jewish_apple_cake
'263': Jiuniang
'264': Kachori
'265': Kadhi
'266': Kakuni
'267': Katsudon
'268': kebab
'269': Kedgeree
'270': Key_lime_pie
'271': Konnyaku
'272': Korean_fried_chicken
'273': Kung_Pao_chicken
'274': Kwetiau_goreng
'275': Laddu
'276': Laksa
'277': Lamb_chop
'278': Lasagna
'279': Lebkuchen
'280': Leek_soup
'281': Lemon_chicken
'282': Lemon_ice_box_pie
'283': Lemon_meringue_pie
'284': Lemon_tart
'285': Lemonade
'286': Lentil_soup
'287': Lettuce_soup
'288': Lo_mein
'289': Lobster_bisque
'290': Lohikeitto
'291': Macaron
'292': Mango_pomelo_sago
'293': Mango_pudding
'294': Mapo_doufu
'295': Meatloaf
'296': Mee_pok
'297': Mee_siam
'298': Melt_sandwich
'299': Menudo
'300': Mì_Quảng
'301': Michigan_hot_dog
'302': Mie_goreng
'303': Milk_Shake
'304': Minestrone
'305': Mirchi_ka_salan
'306': Mission_burrito
'307': Mongolian_beef
'308': Mousse
'309': Mozzarella_sticks
'310': Mushroom_gravy
'311': Mushy_peas
'312': Mutton_handi
'313': Mutton_korma
'314': Nachos
'315': Naengmyeon
'316': Nasi_goreng
'317': Nasi_kuning
'318': Nasi_liwet
'319': Nattō
'320': Noodle_soup
'321': Nuomici
'322': Okinawa_soba
'323': Olivier_salad
'324': Onigiri
'325': Onion_ring
'326': Orange_chicken
'327': Oyster_omelette
'328': Oysters_Rockefeller
'329': Pączki
'330': Paella
'331': Pakora
'332': Panzanella
'333': Pappa_al_pomodoro
'334': Parmigiana
'335': Pasta
'336': Pasta_al_pomodoro
'337': Pasta_alla_Norma
'338': Pasteis_de_Bacalhau
'339': Paupiette
'340': Payasam
'341': Pea_soup
'342': Peanut_butter_cookie
'343': Peanut_soup
'344': Pho
'345': Pickled_cucumber
'346': Pig's_ear
'347': Pig's_organ_soup
'348': Pig's_trotters
'349': Pineapple_tart
'350': Pipikaula
'351': Pizza_carbonara
'352': Pizza_pugliese
'353': Plateau_de_fruits_de_mer
'354': Plum_cake
'355': Pommes_dauphine
'356': Poori
'357': Porchetta
'358': Pork_ball
'359': Pork_blood_soup
'360': Pork_knuckle
'361': Pork_Knuckles_and_Ginger_Stew
'362': Pork_ribs
'363': Pork_tenderloin
'364': Pork_tenderloin_sandwich
'365': Pot_roast
'366': Potato_salad
'367': Poutine
'368': Prawn_cocktail
'369': Prawn_soup
'370': Princess_cake
'371': Profiterole
'372': Pudding_corn
'373': Pulled_pork
'374': Pumpkin_bread
'375': Pumpkin_pie
'376': Quesadilla
'377': Queso_flameado
'378': Quiche_Lorraine
'379': Rack_of_lamb
'380': Raita
'381': Ramen
'382': Red_bean_cake
'383': Red_bean_ice
'384': Red_beans_and_rice
'385': Red_slaw
'386': Refried_beans
'387': Rendang
'388': Rice_and_peas
'389': Rice_ball
'390': Rice_cakes
'391': Rice_pudding
'392': Rogan_josh
'393': Rou_jia_mo
'394': Rouladen
'395': Salmon_pie
'396': Sambar
'397': Samgyetang
'398': Samosa
'399': Sapo_tahu
'400': Sardine
'401': Sashimi
'402': Sauerbraten
'403': Sausage_and_peppers
'404': Sausage_bun
'405': Sausage_roll
'406': Sausage_sandwich
'407': Sauteed_mushrooms
'408': Scampi
'409': Schweinshaxe
'410': Seafood_boil
'411': Seafood_pizza
'412': Sesame_chicken
'413': Shaker_Lemon_Pie
'414': Shanghai_fried_noodles
'415': She-crab_soup
'416': Sheep's_trotters
'417': Shrimp_ball
'418': Shrimp_toast
'419': Shuizhu
'420': Sicilian_orange_salad
'421': Sliced_fish_soup
'422': Smoked_salmon
'423': Soba
'424': Soft-shell_crab
'425': Sopa_de_Mandioca
'426': Soused_herring
'427': Spanish_rice
'428': Spatzle
'429': Spiced_beef
'430': Spinach_salad
'431': Spinach_soup
'432': Squash_soup
'433': Steak_tartare
'434': Steamed_clams
'435': Steamed_meatball
'436': Steamed_rice
'437': Stew_peas
'438': Stinky_tofu
'439': Stir-fried_tomato_and_scrambled_eggs
'440': Strawberry_cake
'441': Strawberry_cheesecake
'442': Strawberry_ice_cream
'443': Strawberry_jam
'444': Strawberry_pie
'445': Strawberry_sundae
'446': Strawberry_tart
'447': Stuffed_avocado
'448': Stuffed_eggplant
'449': Stuffed_mushrooms
'450': Stuffed_peppers
'451': Stuffed_tomatoes
'452': Suanmeitang
'453': Succotash
'454': Suckling_pig
'455': Svíčková
'456': Sweet_potato_pie
'457': Sweet_saffron_rice
'458': Syrniki
'459': Tabbouleh
'460': Tacos
'461': Tafelspitz
'462': Takoyaki
'463': Taro_dumpling
'464': Tinga_dish
'465': Tiramisu
'466': Toad_in_the_hole
'467': Tobiko
'468': Tomato_and_egg_soup
'469': Tomato_bredie
'470': Tomato_omelette
'471': Tomato_soup
'472': Tonkatsu
'473': Tonkotsu_ramen
'474': Torta_caprese
'475': Tostada
'476': Tourtière
'477': Tsukemono
'478': Tuna_fish_sandwich
'479': Tuna_salad
'480': Turnip_cake
'481': Twix
'482': Unagi
'483': Vegetarian_chili
'484': Vichyssoise
'485': Vindaloo
'486': Vitello_tonnato
'487': Wakame
'488': Wenchang_chicken
'489': White_boiled_shrimp
'490': White_cut_chicken
'491': Whitebait
'492': Wonton_noodles
'493': Wotou
'494': Yeung_Chow_fried_rice
'495': Youtiao
'496': Yuxiangrousi
'497': Zhajiangmian
'498': Zongzi
'499': Zurcher_Geschnetzeltes
--- |
bjoernp/laion-2b-mistral_captions-1.3M | 2023-10-09T10:00:11.000Z | [
"region:us"
] | bjoernp | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: TEXT
dtype: string
- name: RESPONSE
dtype: string
- name: captions
sequence: string
splits:
- name: train
num_bytes: 853385896.3491833
num_examples: 1318108
download_size: 540262191
dataset_size: 853385896.3491833
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "laion-2b-mistral_captions-1.3M"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
foureyednymph/introverted | 2023-10-09T10:24:57.000Z | [
"region:us"
] | foureyednymph | null | null | null | 0 | 0 | Entry not found |
Waterfront/test_jsonl | 2023-10-09T10:31:05.000Z | [
"region:us"
] | Waterfront | null | null | null | 0 | 0 | Entry not found |
suminlim/repo_name | 2023-10-09T10:42:36.000Z | [
"region:us"
] | suminlim | null | null | null | 0 | 0 | Entry not found |
suminlim/iherb_items | 2023-10-09T10:42:56.000Z | [
"region:us"
] | suminlim | null | null | null | 0 | 0 | Entry not found |
happylkx/InstructCoder | 2023-10-09T11:28:01.000Z | [
"task_categories:text-generation",
"size_categories:100K<n<1M",
"language:en",
"code",
"region:us"
] | happylkx | null | null | null | 1 | 0 | ---
task_categories:
- text-generation
language:
- en
tags:
- code
pretty_name: instruct_coder
size_categories:
- 100K<n<1M
---
<div align="center">
<img src="https://github.com/Happylkx/InstructCoder/raw/main/docs/logo.png">
</div>
<div align="center">
<a href="https://github.com/qishenghu/CodeInstruct/blob/main/CodeInstruct.pdf">Paper</a> |
<a href="https://github.com/qishenghu/CodeInstruct">Code</a> |
<a href="https://happylkx.github.io/InstructCoder/">Blog</a>
<!-- <a href="https://blog.nus.edu.sg/kaixinli/2023/05/23/codeinstruct/">Blog</a> -->
</div>
<!-- | [Checkpoints](link_to_checkpoints) -->
# InstructCoder (CodeInstruct): Empowering Language Models to Edit Code
## Updates
- May 23, 2023: Paper, code and data released.
## Overview
InstructCoder is the first dataset designed to adapt LLMs for general code editing. It consists of 114,239 ChatGPT-generated instruction-input-output triplets and covers multiple distinct code-editing scenarios. LLaMA-33B finetuned on InstructCoder performs on par with ChatGPT on a real-world test set derived from GitHub commits.

In the ever-evolving world of software development, efficient and effective code editing plays a pivotal role. As the demands for faster development cycles and increased productivity continue to rise, harnessing the power of Large Language Models in code editing has emerged as a game-changer. This project aims to revolutionize code editing, empowering developers with intelligent tools and techniques.
Though the community has studied coding with LLMs extensively, our experiments demonstrate that current LLMs still struggle with code editing. To this end, we curate a code-editing dataset, dubbed InstructCoder, for improving and evaluating the code-editing abilities of LLMs. InstructCoder is an instructional dataset containing diverse code-editing tasks, covering 20 scenarios where code editing may be performed. [The dataset is released here.](https://github.com/Happylkx/InstructCoder) Our empirical experiments reveal that open-source LLMs display notable gains in code-editing abilities after finetuning on InstructCoder. Some qualitative examples are provided below.
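For a quick look at the data itself, here is a minimal sketch of loading it with the Hugging Face `datasets` library. The repository id comes from this card; the default configuration and split layout are assumptions, so inspect the returned object before relying on it:
```python
from datasets import load_dataset

# Repo id taken from this card; the default config and split layout are assumptions.
ds = load_dataset("happylkx/InstructCoder")
print(ds)  # inspect the available splits and features
```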
## Data Collection
InstructCoder is systematically collected through an iterative process. Initial seed tasks are selected from GitHub commits. Inspired by Self-Instruct, we prompt ChatGPT to bootstrap new instructions. Then, plausible scenarios where the filtered instructions may be used are generated. Finally, corresponding code input and output are obtained conditioned on both the instruction and scenario. High-quality samples are manually selected and recurrently added to the task pool for further generation.
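To make this pipeline concrete, below is a purely illustrative sketch of the bootstrapping loop. The `ask_chatgpt` helper, the prompt strings, and the `looks_high_quality` filter are hypothetical placeholders, not the project's actual implementation:
```python
import random

def ask_chatgpt(prompt: str) -> str:
    # Hypothetical stand-in for a ChatGPT API call; swap in a real client.
    return f"<model response to: {prompt[:40]}...>"

def looks_high_quality(sample: dict) -> bool:
    # Placeholder for the manual selection step described above.
    return True

def bootstrap(seed_tasks, rounds=3, demos_per_prompt=4):
    task_pool = list(seed_tasks)  # seed instructions mined from GitHub commits
    for _ in range(rounds):
        demos = random.sample(task_pool, k=min(demos_per_prompt, len(task_pool)))
        # 1) Bootstrap a new code-editing instruction from in-context demonstrations.
        instruction = ask_chatgpt(f"Write a new code-editing instruction similar to: {demos}")
        # 2) Generate a plausible scenario where the instruction could be used.
        scenario = ask_chatgpt(f"Describe a realistic scenario for: {instruction}")
        # 3) Generate input and output code conditioned on instruction and scenario.
        code = ask_chatgpt(f"For '{instruction}' in '{scenario}', write the input code and the edited code.")
        sample = {"instruction": instruction, "scenario": scenario, "code": code}
        if looks_high_quality(sample):  # vetted samples re-enter the task pool
            task_pool.append(sample)
    return task_pool
```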
## Results
The largest model fine-tuned, LLaMA-33B, performs on par with ChatGPT, with 79.3% accuracy on the test set. Some qualitative examples generated by our model are shown here.
## Release
We are planning to release the following assets:
- [x] Full dataset
- [x] Source Code
<!-- The LoRA checkpoints are available [here](link_to_checkpoints). -->
## Qualitative Examples
In this section, we showcase three qualitative examples generated by the LLaMA-33B model.
1. **Instruction:** Transform the code into a class method called ‘calculate_distance’ that calculates the distance between two points in 3D space.
**Input code:**
```python
import math
def distance(point1, point2):
x1, y1, z1 = point1
x2, y2, z2 = point2
dx = x2 - x1
dy = y2 - y1
dz = z2 - z1
return math.sqrt(dx**2 + dy**2 + dz**2)
point1 = (0, 0, 0)
point2 = (1, 1, 1)
print(distance(point1, point2))
```
**Output code:**
```python
import math
class Point:
def __init__(self, x, y, z):
self.x = x
self.y = y
self.z = z
def calculate_distance(self, other_point):
dx = other_point.x - self.x
dy = other_point.y - self.y
dz = other_point.z - self.z
return math.sqrt(dx**2 + dy**2 + dz**2)
point1 = Point(0, 0, 0)
point2 = Point(1, 1, 1)
print(point1.calculate_distance(point2))
```
2. **Instruction:** Integrate error handling into the code that performs batch processing of large datasets, including job failures or incomplete runs.
**Input code:**
```python
import pandas as pd
def process_orders(orders_file, shipping_file, inventory_file):
orders = pd.read_csv(orders_file)
shipping = pd.read_csv(shipping_file)
inventory = pd.read_csv(inventory_file)
# Perform data processing and analysis here...
results = {"success": True, "message": "Batch processing completed successfully."}
return results
orders_file = "orders.csv"
shipping_file = "shipping.csv"
inventory_file = "inventory.csv"
results = process_orders(orders_file, shipping_file, inventory_file)
print(results)
```
**Output code:**
```python
import pandas as pd
import logging
logging.basicConfig(filename='batch_processing.log', level=logging.ERROR)
def process_orders(orders_file, shipping_file, inventory_file):
try:
orders = pd.read_csv(orders_file)
shipping = pd.read_csv(shipping_file)
inventory = pd.read_csv(inventory_file)
# Perform data processing and analysis here...
results = {"success": True, "message": "Batch processing completed successfully."}
return results
except Exception as e:
logging.error(f"Error processing batch: {e}")
results = {"success": False, "message": "Batch processing failed. Please check the input files for any issues."}
return results
orders_file = "orders.csv"
shipping_file = "shipping.csv"
inventory_file = "inventory.csv"
results = process_orders(orders_file, shipping_file, inventory_file)
print(results)
```
3. **Instruction:** Refactor any code that uses ‘exec’ or ‘eval’ for security reasons.
**Input code:**
```python
game_data = {
"score": 0,
"level": 1
}
def execute_script(script):
exec(script)
script = """
game_data["score"] += 10
if game_data["score"] >= 100:
game_data["level"] += 1
"""
execute_script(script)
print(game_data)
```
**Output code:**
```python
game_data = {
"score": 0,
"level": 1
}
def execute_script(script):
    compiled_script = compile(script, "<string>", "exec")
exec(compiled_script, game_data)
script = """
game_data["score"] += 10
if game_data["score"] >= 100:
game_data["level"] += 1
"""
execute_script(script)
print(game_data)
```
Now the script is no longer able to access the data in `game_data`.
## Limitations and Safety Concerns
By recognizing and addressing limitations and safety concerns, we hope developers can foster responsible AI deployment and use, promoting a more ethical, equitable, and secure future for large language models in code production.
1. **License:** You must ensure compliance with the corresponding licenses of the LLMs. It is important to thoroughly review the licenses and make sure that your usage is in accordance with their terms. For instance, you are not allowed to use LLaMA commercially, for it is released under a noncommercial license.
2. **Incomplete or Imperfect Knowledge:** LLMs are trained on vast amounts of data, which may not always be up-to-date or entirely accurate. For example, the APIs of a library may change over time. Consequently, the information provided by the models could be outdated, inaccurate, or even misleading in some instances.
3. **Overuse and Dependency:** Users might incorrectly interpret or rely too heavily on the outputs generated by large language models. It is crucial to provide proper guidance and promote an understanding of the model’s limitations, encouraging users to critically assess and verify the information or suggestions provided. Please make sure to check the models’ generations before using them.
Overreliance on large language models could lead to complacency, potentially causing users to undervalue human intelligence, such as creativity and critical thinking. We encourage users to use AI as a tool to supplement, rather than replace, human input and judgment.
4. **Malicious Use:** There is a risk that malicious actors might use the tools for nefarious purposes, such as generating malicious software. It is important to monitor the use and deployment of these models, track and report abuse, and develop countermeasures to address potential malicious activity.
5. **Bias and Discrimination:** Language models can inherit societal biases present in their training data, possibly leading to discriminatory or biased generations. Though our dataset is unlikely to contain such toxic data, it may still surface in responses because of the base LLMs.
## Citation
Feel free to cite our work if you find it interesting or use the data:
```plain
@misc{2023instructcoder,
title={InstructCoder: Empowering Language Models to Edit Code},
url={https://github.com/qishenghu/CodeInstruct},
year = {2023},
publisher = {GitHub},
journal = {GitHub repository},
author={Hu, Qisheng and Li, Kaixin and Zhao, Xu and Xie, Yuxi and Liu, Tiedong and Chen, Hui and Xie, Qizhe and He, Junxian}}
```
## Conclusion
The integration of AI into code editing represents a significant milestone in the evolution of software development. By leveraging AI’s capabilities in understanding code semantics, patterns, and best practices, developers can unlock new levels of productivity, code quality, and efficiency. The project explored here demonstrates the immense potential of intelligent code editing tools. As the software development landscape continues to evolve, embracing AI is poised to become standard practice, setting the stage for a future where developers can focus more on creativity and problem-solving while AI handles the mundane aspects of coding.
|
promptora11/llama2 | 2023-10-09T11:27:06.000Z | [
"region:us"
] | promptora11 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 338808
num_examples: 200
download_size: 201257
dataset_size: 338808
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "llama2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Cheemsdoge/TH11 | 2023-10-09T11:48:03.000Z | [
"task_categories:image-classification",
"size_categories:n<1K",
"language:en",
"license:apache-2.0",
"art",
"region:us"
] | Cheemsdoge | null | null | null | 0 | 0 | ---
license: apache-2.0
task_categories:
- image-classification
language:
- en
tags:
- art
pretty_name: Town Hall 11
size_categories:
- n<1K
--- |
SYSPIN/LIMMITS24_target_speaker_fewshot_samples | 2023-10-09T11:40:32.000Z | [
"license:cc-by-4.0",
"region:us"
] | SYSPIN | null | null | null | 0 | 0 | ---
license: cc-by-4.0
---
|
open-llm-leaderboard/details_migtissera__SynthIA-7B-v1.3 | 2023-10-09T11:59:24.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of migtissera/SynthIA-7B-v1.3
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [migtissera/SynthIA-7B-v1.3](https://huggingface.co/migtissera/SynthIA-7B-v1.3)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_migtissera__SynthIA-7B-v1.3\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-09T11:58:55.532772](https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__SynthIA-7B-v1.3/blob/main/results_2023-10-09T11-58-55.532772.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6260571760719941,\n\
\ \"acc_stderr\": 0.03318501686714267,\n \"acc_norm\": 0.6299102590458732,\n\
\ \"acc_norm_stderr\": 0.03316308885976868,\n \"mc1\": 0.34149326805385555,\n\
\ \"mc1_stderr\": 0.016600688619950826,\n \"mc2\": 0.5136907901080823,\n\
\ \"mc2_stderr\": 0.015224284656166093\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5853242320819113,\n \"acc_stderr\": 0.014397070564409174,\n\
\ \"acc_norm\": 0.621160409556314,\n \"acc_norm_stderr\": 0.014175915490000326\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6429994025094603,\n\
\ \"acc_stderr\": 0.004781358113341955,\n \"acc_norm\": 0.8344951204939255,\n\
\ \"acc_norm_stderr\": 0.003708760752685524\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \
\ \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"\
acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6644736842105263,\n \"acc_stderr\": 0.038424985593952694,\n\
\ \"acc_norm\": 0.6644736842105263,\n \"acc_norm_stderr\": 0.038424985593952694\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6792452830188679,\n \"acc_stderr\": 0.028727502957880267,\n\
\ \"acc_norm\": 0.6792452830188679,\n \"acc_norm_stderr\": 0.028727502957880267\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7013888888888888,\n\
\ \"acc_stderr\": 0.03827052357950756,\n \"acc_norm\": 0.7013888888888888,\n\
\ \"acc_norm_stderr\": 0.03827052357950756\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n\
\ \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n\
\ \"acc_stderr\": 0.0368122963339432,\n \"acc_norm\": 0.630057803468208,\n\
\ \"acc_norm_stderr\": 0.0368122963339432\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.047240073523838876,\n\
\ \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.047240073523838876\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n\
\ \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5361702127659574,\n \"acc_stderr\": 0.03260038511835771,\n\
\ \"acc_norm\": 0.5361702127659574,\n \"acc_norm_stderr\": 0.03260038511835771\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n\
\ \"acc_stderr\": 0.04677473004491199,\n \"acc_norm\": 0.4473684210526316,\n\
\ \"acc_norm_stderr\": 0.04677473004491199\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4021164021164021,\n \"acc_stderr\": 0.02525303255499769,\n \"\
acc_norm\": 0.4021164021164021,\n \"acc_norm_stderr\": 0.02525303255499769\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7290322580645161,\n\
\ \"acc_stderr\": 0.025284416114900156,\n \"acc_norm\": 0.7290322580645161,\n\
\ \"acc_norm_stderr\": 0.025284416114900156\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4482758620689655,\n \"acc_stderr\": 0.03499113137676744,\n\
\ \"acc_norm\": 0.4482758620689655,\n \"acc_norm_stderr\": 0.03499113137676744\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009181,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009181\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586815,\n \"\
acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586815\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.024639789097709443,\n\
\ \"acc_norm\": 0.8652849740932642,\n \"acc_norm_stderr\": 0.024639789097709443\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6358974358974359,\n \"acc_stderr\": 0.024396672985094767,\n\
\ \"acc_norm\": 0.6358974358974359,\n \"acc_norm_stderr\": 0.024396672985094767\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253255,\n \
\ \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253255\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6428571428571429,\n \"acc_stderr\": 0.031124619309328177,\n\
\ \"acc_norm\": 0.6428571428571429,\n \"acc_norm_stderr\": 0.031124619309328177\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255169,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255169\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8201834862385321,\n \"acc_stderr\": 0.016465345467391528,\n \"\
acc_norm\": 0.8201834862385321,\n \"acc_norm_stderr\": 0.016465345467391528\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.48148148148148145,\n \"acc_stderr\": 0.034076320938540516,\n \"\
acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.034076320938540516\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7892156862745098,\n \"acc_stderr\": 0.0286265479124374,\n \"acc_norm\"\
: 0.7892156862745098,\n \"acc_norm_stderr\": 0.0286265479124374\n },\n\
\ \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\":\
\ 0.7805907172995781,\n \"acc_stderr\": 0.026939106581553945,\n \"\
acc_norm\": 0.7805907172995781,\n \"acc_norm_stderr\": 0.026939106581553945\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n\
\ \"acc_stderr\": 0.03755265865037181,\n \"acc_norm\": 0.8148148148148148,\n\
\ \"acc_norm_stderr\": 0.03755265865037181\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7423312883435583,\n \"acc_stderr\": 0.03436150827846917,\n\
\ \"acc_norm\": 0.7423312883435583,\n \"acc_norm_stderr\": 0.03436150827846917\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n\
\ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n\
\ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n\
\ \"acc_stderr\": 0.022209309073165612,\n \"acc_norm\": 0.8675213675213675,\n\
\ \"acc_norm_stderr\": 0.022209309073165612\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8071519795657727,\n\
\ \"acc_stderr\": 0.014108533515757431,\n \"acc_norm\": 0.8071519795657727,\n\
\ \"acc_norm_stderr\": 0.014108533515757431\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7023121387283237,\n \"acc_stderr\": 0.024617055388677,\n\
\ \"acc_norm\": 0.7023121387283237,\n \"acc_norm_stderr\": 0.024617055388677\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4111731843575419,\n\
\ \"acc_stderr\": 0.016456498033977512,\n \"acc_norm\": 0.4111731843575419,\n\
\ \"acc_norm_stderr\": 0.016456498033977512\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6830065359477124,\n \"acc_stderr\": 0.026643278474508755,\n\
\ \"acc_norm\": 0.6830065359477124,\n \"acc_norm_stderr\": 0.026643278474508755\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n\
\ \"acc_stderr\": 0.025922371788818763,\n \"acc_norm\": 0.7041800643086816,\n\
\ \"acc_norm_stderr\": 0.025922371788818763\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7191358024691358,\n \"acc_stderr\": 0.02500646975579921,\n\
\ \"acc_norm\": 0.7191358024691358,\n \"acc_norm_stderr\": 0.02500646975579921\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4397163120567376,\n \"acc_stderr\": 0.029609912075594106,\n \
\ \"acc_norm\": 0.4397163120567376,\n \"acc_norm_stderr\": 0.029609912075594106\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46479791395045633,\n\
\ \"acc_stderr\": 0.01273854737130396,\n \"acc_norm\": 0.46479791395045633,\n\
\ \"acc_norm_stderr\": 0.01273854737130396\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.028418208619406755,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.028418208619406755\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6715686274509803,\n \"acc_stderr\": 0.018999707383162666,\n \
\ \"acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.018999707383162666\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6857142857142857,\n \"acc_stderr\": 0.029719329422417475,\n\
\ \"acc_norm\": 0.6857142857142857,\n \"acc_norm_stderr\": 0.029719329422417475\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n\
\ \"acc_stderr\": 0.025538433368578337,\n \"acc_norm\": 0.845771144278607,\n\
\ \"acc_norm_stderr\": 0.025538433368578337\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536955,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536955\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5060240963855421,\n\
\ \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.5060240963855421,\n\
\ \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n\
\ \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.34149326805385555,\n\
\ \"mc1_stderr\": 0.016600688619950826,\n \"mc2\": 0.5136907901080823,\n\
\ \"mc2_stderr\": 0.015224284656166093\n }\n}\n```"
repo_url: https://huggingface.co/migtissera/SynthIA-7B-v1.3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|arc:challenge|25_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|arc:challenge|25_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hellaswag|10_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hellaswag|10_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T11-58-55.532772.parquet'
- config_name: results
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- results_2023-10-09T11-48-18.823660.parquet
- split: 2023_10_09T11_58_55.532772
path:
- results_2023-10-09T11-58-55.532772.parquet
- split: latest
path:
- results_2023-10-09T11-58-55.532772.parquet
---
# Dataset Card for Evaluation run of migtissera/SynthIA-7B-v1.3
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/migtissera/SynthIA-7B-v1.3
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [migtissera/SynthIA-7B-v1.3](https://huggingface.co/migtissera/SynthIA-7B-v1.3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_migtissera__SynthIA-7B-v1.3",
"harness_truthfulqa_mc_0",
split="train")
```
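Because every run is stored as a timestamped split, you can also pin a specific run instead of following `latest`, or pull the aggregated metrics from the `results` configuration. A minimal sketch using the same `datasets` API (the config and split names come from the listing above):
```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_migtissera__SynthIA-7B-v1.3"

# One configuration per evaluated task, plus the aggregated "results" config.
configs = get_dataset_config_names(repo)

# Pin the first run via its timestamped split...
first_run = load_dataset(repo, "results", split="2023_10_09T11_48_18.823660")

# ...or follow the most recent run with the "latest" split.
latest = load_dataset(repo, "results", split="latest")
```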
## Latest results
These are the [latest results from run 2023-10-09T11:58:55.532772](https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__SynthIA-7B-v1.3/blob/main/results_2023-10-09T11-58-55.532772.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6260571760719941,
"acc_stderr": 0.03318501686714267,
"acc_norm": 0.6299102590458732,
"acc_norm_stderr": 0.03316308885976868,
"mc1": 0.34149326805385555,
"mc1_stderr": 0.016600688619950826,
"mc2": 0.5136907901080823,
"mc2_stderr": 0.015224284656166093
},
"harness|arc:challenge|25": {
"acc": 0.5853242320819113,
"acc_stderr": 0.014397070564409174,
"acc_norm": 0.621160409556314,
"acc_norm_stderr": 0.014175915490000326
},
"harness|hellaswag|10": {
"acc": 0.6429994025094603,
"acc_stderr": 0.004781358113341955,
"acc_norm": 0.8344951204939255,
"acc_norm_stderr": 0.003708760752685524
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6644736842105263,
"acc_stderr": 0.038424985593952694,
"acc_norm": 0.6644736842105263,
"acc_norm_stderr": 0.038424985593952694
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6792452830188679,
"acc_stderr": 0.028727502957880267,
"acc_norm": 0.6792452830188679,
"acc_norm_stderr": 0.028727502957880267
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7013888888888888,
"acc_stderr": 0.03827052357950756,
"acc_norm": 0.7013888888888888,
"acc_norm_stderr": 0.03827052357950756
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.0368122963339432,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.0368122963339432
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.047240073523838876,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.047240073523838876
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5361702127659574,
"acc_stderr": 0.03260038511835771,
"acc_norm": 0.5361702127659574,
"acc_norm_stderr": 0.03260038511835771
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.04677473004491199,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.04677473004491199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4021164021164021,
"acc_stderr": 0.02525303255499769,
"acc_norm": 0.4021164021164021,
"acc_norm_stderr": 0.02525303255499769
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04426266681379909,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04426266681379909
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7290322580645161,
"acc_stderr": 0.025284416114900156,
"acc_norm": 0.7290322580645161,
"acc_norm_stderr": 0.025284416114900156
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4482758620689655,
"acc_stderr": 0.03499113137676744,
"acc_norm": 0.4482758620689655,
"acc_norm_stderr": 0.03499113137676744
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009181,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009181
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586815,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586815
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8652849740932642,
"acc_stderr": 0.024639789097709443,
"acc_norm": 0.8652849740932642,
"acc_norm_stderr": 0.024639789097709443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6358974358974359,
"acc_stderr": 0.024396672985094767,
"acc_norm": 0.6358974358974359,
"acc_norm_stderr": 0.024396672985094767
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.028820884666253255,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.028820884666253255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6428571428571429,
"acc_stderr": 0.031124619309328177,
"acc_norm": 0.6428571428571429,
"acc_norm_stderr": 0.031124619309328177
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.03861557546255169,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.03861557546255169
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8201834862385321,
"acc_stderr": 0.016465345467391528,
"acc_norm": 0.8201834862385321,
"acc_norm_stderr": 0.016465345467391528
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.034076320938540516,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.034076320938540516
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.0286265479124374,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.0286265479124374
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7805907172995781,
"acc_stderr": 0.026939106581553945,
"acc_norm": 0.7805907172995781,
"acc_norm_stderr": 0.026939106581553945
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.03755265865037181,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.03755265865037181
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7423312883435583,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.7423312883435583,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165612,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165612
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8071519795657727,
"acc_stderr": 0.014108533515757431,
"acc_norm": 0.8071519795657727,
"acc_norm_stderr": 0.014108533515757431
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7023121387283237,
"acc_stderr": 0.024617055388677,
"acc_norm": 0.7023121387283237,
"acc_norm_stderr": 0.024617055388677
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4111731843575419,
"acc_stderr": 0.016456498033977512,
"acc_norm": 0.4111731843575419,
"acc_norm_stderr": 0.016456498033977512
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6830065359477124,
"acc_stderr": 0.026643278474508755,
"acc_norm": 0.6830065359477124,
"acc_norm_stderr": 0.026643278474508755
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.025922371788818763,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.025922371788818763
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7191358024691358,
"acc_stderr": 0.02500646975579921,
"acc_norm": 0.7191358024691358,
"acc_norm_stderr": 0.02500646975579921
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4397163120567376,
"acc_stderr": 0.029609912075594106,
"acc_norm": 0.4397163120567376,
"acc_norm_stderr": 0.029609912075594106
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46479791395045633,
"acc_stderr": 0.01273854737130396,
"acc_norm": 0.46479791395045633,
"acc_norm_stderr": 0.01273854737130396
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.028418208619406755,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.028418208619406755
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6715686274509803,
"acc_stderr": 0.018999707383162666,
"acc_norm": 0.6715686274509803,
"acc_norm_stderr": 0.018999707383162666
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6857142857142857,
"acc_stderr": 0.029719329422417475,
"acc_norm": 0.6857142857142857,
"acc_norm_stderr": 0.029719329422417475
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578337,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578337
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5060240963855421,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.5060240963855421,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.34149326805385555,
"mc1_stderr": 0.016600688619950826,
"mc2": 0.5136907901080823,
"mc2_stderr": 0.015224284656166093
}
}
```
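If you only need these aggregated numbers, you can fetch the JSON file linked above directly instead of loading the parquet details. A minimal sketch with `huggingface_hub` (the filename comes from the link above; the exact nesting of the on-disk JSON is an assumption, so the lookup below falls back to the top-level layout shown here):
```python
import json

from huggingface_hub import hf_hub_download

# Download the aggregated results file referenced in the link above.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_migtissera__SynthIA-7B-v1.3",
    filename="results_2023-10-09T11-58-55.532772.json",
    repo_type="dataset",
)

with open(path) as f:
    data = json.load(f)

# The file may nest the metrics under a "results" key (assumption); fall back
# to the top level, which matches the dictionary printed above.
metrics = data.get("results", data)
print(metrics["all"]["acc"], metrics["all"]["mc2"])
```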
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Praghxx/Praghxxx | 2023-10-09T12:00:07.000Z | [
"license:openrail",
"region:us"
] | Praghxx | null | null | null | 0 | 0 | ---
license: openrail
---
|
open-llm-leaderboard/details_ehartford__dolphin-2.0-mistral-7b | 2023-10-09T12:07:49.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of ehartford/dolphin-2.0-mistral-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ehartford/dolphin-2.0-mistral-7b](https://huggingface.co/ehartford/dolphin-2.0-mistral-7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run. Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ehartford__dolphin-2.0-mistral-7b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-09T12:06:26.268228](https://huggingface.co/datasets/open-llm-leaderboard/details_ehartford__dolphin-2.0-mistral-7b/blob/main/results_2023-10-09T12-06-26.268228.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5693671370041259,\n\
\ \"acc_stderr\": 0.034632266162612044,\n \"acc_norm\": 0.5733414756143802,\n\
\ \"acc_norm_stderr\": 0.03461412466327177,\n \"mc1\": 0.4369645042839657,\n\
\ \"mc1_stderr\": 0.01736384450319598,\n \"mc2\": 0.6108755423851862,\n\
\ \"mc2_stderr\": 0.015320480953391271\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5520477815699659,\n \"acc_stderr\": 0.014532011498211681,\n\
\ \"acc_norm\": 0.5921501706484642,\n \"acc_norm_stderr\": 0.014361097288449703\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6082453694483171,\n\
\ \"acc_stderr\": 0.00487144710655492,\n \"acc_norm\": 0.8026289583748257,\n\
\ \"acc_norm_stderr\": 0.003972012855240825\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909283\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.48148148148148145,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.48148148148148145,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.618421052631579,\n \"acc_stderr\": 0.03953173377749194,\n\
\ \"acc_norm\": 0.618421052631579,\n \"acc_norm_stderr\": 0.03953173377749194\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6339622641509434,\n \"acc_stderr\": 0.029647813539365245,\n\
\ \"acc_norm\": 0.6339622641509434,\n \"acc_norm_stderr\": 0.029647813539365245\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6458333333333334,\n\
\ \"acc_stderr\": 0.039994111357535424,\n \"acc_norm\": 0.6458333333333334,\n\
\ \"acc_norm_stderr\": 0.039994111357535424\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.04999999999999999,\n \
\ \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.04999999999999999\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n\
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5780346820809249,\n\
\ \"acc_stderr\": 0.03765746693865149,\n \"acc_norm\": 0.5780346820809249,\n\
\ \"acc_norm_stderr\": 0.03765746693865149\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.046550104113196177,\n\
\ \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.046550104113196177\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.71,\n \"acc_stderr\": 0.04560480215720683,\n \"acc_norm\": 0.71,\n\
\ \"acc_norm_stderr\": 0.04560480215720683\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.49361702127659574,\n \"acc_stderr\": 0.03268335899936337,\n\
\ \"acc_norm\": 0.49361702127659574,\n \"acc_norm_stderr\": 0.03268335899936337\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3684210526315789,\n\
\ \"acc_stderr\": 0.04537815354939391,\n \"acc_norm\": 0.3684210526315789,\n\
\ \"acc_norm_stderr\": 0.04537815354939391\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370332,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370332\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.373015873015873,\n \"acc_stderr\": 0.02490699045899257,\n \"acc_norm\"\
: 0.373015873015873,\n \"acc_norm_stderr\": 0.02490699045899257\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n\
\ \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n\
\ \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6161290322580645,\n\
\ \"acc_stderr\": 0.027666182075539635,\n \"acc_norm\": 0.6161290322580645,\n\
\ \"acc_norm_stderr\": 0.027666182075539635\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.41379310344827586,\n \"acc_stderr\": 0.03465304488406796,\n\
\ \"acc_norm\": 0.41379310344827586,\n \"acc_norm_stderr\": 0.03465304488406796\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\"\
: 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7151515151515152,\n \"acc_stderr\": 0.03524390844511781,\n\
\ \"acc_norm\": 0.7151515151515152,\n \"acc_norm_stderr\": 0.03524390844511781\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6818181818181818,\n \"acc_stderr\": 0.0331847733384533,\n \"acc_norm\"\
: 0.6818181818181818,\n \"acc_norm_stderr\": 0.0331847733384533\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.7979274611398963,\n \"acc_stderr\": 0.028979089794296732,\n\
\ \"acc_norm\": 0.7979274611398963,\n \"acc_norm_stderr\": 0.028979089794296732\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5435897435897435,\n \"acc_stderr\": 0.025254485424799605,\n\
\ \"acc_norm\": 0.5435897435897435,\n \"acc_norm_stderr\": 0.025254485424799605\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253255,\n \
\ \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253255\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5672268907563025,\n \"acc_stderr\": 0.032183581077426124,\n\
\ \"acc_norm\": 0.5672268907563025,\n \"acc_norm_stderr\": 0.032183581077426124\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7357798165137615,\n \"acc_stderr\": 0.018904164171510168,\n \"\
acc_norm\": 0.7357798165137615,\n \"acc_norm_stderr\": 0.018904164171510168\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4212962962962963,\n \"acc_stderr\": 0.03367462138896078,\n \"\
acc_norm\": 0.4212962962962963,\n \"acc_norm_stderr\": 0.03367462138896078\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7303921568627451,\n \"acc_stderr\": 0.031145570659486782,\n \"\
acc_norm\": 0.7303921568627451,\n \"acc_norm_stderr\": 0.031145570659486782\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7257383966244726,\n \"acc_stderr\": 0.02904133351059804,\n \
\ \"acc_norm\": 0.7257383966244726,\n \"acc_norm_stderr\": 0.02904133351059804\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6188340807174888,\n\
\ \"acc_stderr\": 0.03259625118416827,\n \"acc_norm\": 0.6188340807174888,\n\
\ \"acc_norm_stderr\": 0.03259625118416827\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6793893129770993,\n \"acc_stderr\": 0.04093329229834278,\n\
\ \"acc_norm\": 0.6793893129770993,\n \"acc_norm_stderr\": 0.04093329229834278\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6528925619834711,\n \"acc_stderr\": 0.043457245702925335,\n \"\
acc_norm\": 0.6528925619834711,\n \"acc_norm_stderr\": 0.043457245702925335\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.04557239513497752,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.04557239513497752\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6625766871165644,\n \"acc_stderr\": 0.03714908409935573,\n\
\ \"acc_norm\": 0.6625766871165644,\n \"acc_norm_stderr\": 0.03714908409935573\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n\
\ \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n\
\ \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n\
\ \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.811965811965812,\n\
\ \"acc_stderr\": 0.025598193686652268,\n \"acc_norm\": 0.811965811965812,\n\
\ \"acc_norm_stderr\": 0.025598193686652268\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695238,\n \
\ \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695238\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7547892720306514,\n\
\ \"acc_stderr\": 0.01538435228454393,\n \"acc_norm\": 0.7547892720306514,\n\
\ \"acc_norm_stderr\": 0.01538435228454393\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6329479768786127,\n \"acc_stderr\": 0.025950054337654075,\n\
\ \"acc_norm\": 0.6329479768786127,\n \"acc_norm_stderr\": 0.025950054337654075\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3463687150837989,\n\
\ \"acc_stderr\": 0.015913546784020117,\n \"acc_norm\": 0.3463687150837989,\n\
\ \"acc_norm_stderr\": 0.015913546784020117\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5915032679738562,\n \"acc_stderr\": 0.028146405993096358,\n\
\ \"acc_norm\": 0.5915032679738562,\n \"acc_norm_stderr\": 0.028146405993096358\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6141479099678456,\n\
\ \"acc_stderr\": 0.027648149599751468,\n \"acc_norm\": 0.6141479099678456,\n\
\ \"acc_norm_stderr\": 0.027648149599751468\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.595679012345679,\n \"acc_stderr\": 0.027306625297327684,\n\
\ \"acc_norm\": 0.595679012345679,\n \"acc_norm_stderr\": 0.027306625297327684\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4148936170212766,\n \"acc_stderr\": 0.029392236584612503,\n \
\ \"acc_norm\": 0.4148936170212766,\n \"acc_norm_stderr\": 0.029392236584612503\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.408735332464146,\n\
\ \"acc_stderr\": 0.012555701346703384,\n \"acc_norm\": 0.408735332464146,\n\
\ \"acc_norm_stderr\": 0.012555701346703384\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5992647058823529,\n \"acc_stderr\": 0.02976826352893311,\n\
\ \"acc_norm\": 0.5992647058823529,\n \"acc_norm_stderr\": 0.02976826352893311\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5588235294117647,\n \"acc_stderr\": 0.020087362076702853,\n \
\ \"acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.020087362076702853\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6181818181818182,\n\
\ \"acc_stderr\": 0.046534298079135075,\n \"acc_norm\": 0.6181818181818182,\n\
\ \"acc_norm_stderr\": 0.046534298079135075\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7020408163265306,\n \"acc_stderr\": 0.02927956741106568,\n\
\ \"acc_norm\": 0.7020408163265306,\n \"acc_norm_stderr\": 0.02927956741106568\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7114427860696517,\n\
\ \"acc_stderr\": 0.03203841040213321,\n \"acc_norm\": 0.7114427860696517,\n\
\ \"acc_norm_stderr\": 0.03203841040213321\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n\
\ \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n\
\ \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7309941520467836,\n \"acc_stderr\": 0.03401052620104089,\n\
\ \"acc_norm\": 0.7309941520467836,\n \"acc_norm_stderr\": 0.03401052620104089\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4369645042839657,\n\
\ \"mc1_stderr\": 0.01736384450319598,\n \"mc2\": 0.6108755423851862,\n\
\ \"mc2_stderr\": 0.015320480953391271\n }\n}\n```"
repo_url: https://huggingface.co/ehartford/dolphin-2.0-mistral-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|arc:challenge|25_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hellaswag|10_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T12-06-26.268228.parquet'
- config_name: results
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- results_2023-10-09T12-06-26.268228.parquet
- split: latest
path:
- results_2023-10-09T12-06-26.268228.parquet
---
# Dataset Card for Evaluation run of ehartford/dolphin-2.0-mistral-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/ehartford/dolphin-2.0-mistral-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [ehartford/dolphin-2.0-mistral-7b](https://huggingface.co/ehartford/dolphin-2.0-mistral-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ehartford__dolphin-2.0-mistral-7b",
"harness_truthfulqa_mc_0",
split="train")
```
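The same call works for any configuration and split listed in the metadata above. As a minimal sketch, here is how you might load the aggregated "results" configuration at its "latest" split (which always points to the most recent run):
```python
from datasets import load_dataset

# The "results" configuration holds the aggregated metrics for the run;
# every configuration also exposes a "latest" split pointing to the newest files.
results = load_dataset(
    "open-llm-leaderboard/details_ehartford__dolphin-2.0-mistral-7b",
    "results",
    split="latest",
)
```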
## Latest results
These are the [latest results from run 2023-10-09T12:06:26.268228](https://huggingface.co/datasets/open-llm-leaderboard/details_ehartford__dolphin-2.0-mistral-7b/blob/main/results_2023-10-09T12-06-26.268228.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5693671370041259,
"acc_stderr": 0.034632266162612044,
"acc_norm": 0.5733414756143802,
"acc_norm_stderr": 0.03461412466327177,
"mc1": 0.4369645042839657,
"mc1_stderr": 0.01736384450319598,
"mc2": 0.6108755423851862,
"mc2_stderr": 0.015320480953391271
},
"harness|arc:challenge|25": {
"acc": 0.5520477815699659,
"acc_stderr": 0.014532011498211681,
"acc_norm": 0.5921501706484642,
"acc_norm_stderr": 0.014361097288449703
},
"harness|hellaswag|10": {
"acc": 0.6082453694483171,
"acc_stderr": 0.00487144710655492,
"acc_norm": 0.8026289583748257,
"acc_norm_stderr": 0.003972012855240825
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.618421052631579,
"acc_stderr": 0.03953173377749194,
"acc_norm": 0.618421052631579,
"acc_norm_stderr": 0.03953173377749194
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6339622641509434,
"acc_stderr": 0.029647813539365245,
"acc_norm": 0.6339622641509434,
"acc_norm_stderr": 0.029647813539365245
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6458333333333334,
"acc_stderr": 0.039994111357535424,
"acc_norm": 0.6458333333333334,
"acc_norm_stderr": 0.039994111357535424
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.04999999999999999,
"acc_norm": 0.45,
"acc_norm_stderr": 0.04999999999999999
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5780346820809249,
"acc_stderr": 0.03765746693865149,
"acc_norm": 0.5780346820809249,
"acc_norm_stderr": 0.03765746693865149
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3235294117647059,
"acc_stderr": 0.046550104113196177,
"acc_norm": 0.3235294117647059,
"acc_norm_stderr": 0.046550104113196177
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720683,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720683
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.49361702127659574,
"acc_stderr": 0.03268335899936337,
"acc_norm": 0.49361702127659574,
"acc_norm_stderr": 0.03268335899936337
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3684210526315789,
"acc_stderr": 0.04537815354939391,
"acc_norm": 0.3684210526315789,
"acc_norm_stderr": 0.04537815354939391
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370332,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370332
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.02490699045899257,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.02490699045899257
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.04360314860077459,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.04360314860077459
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6161290322580645,
"acc_stderr": 0.027666182075539635,
"acc_norm": 0.6161290322580645,
"acc_norm_stderr": 0.027666182075539635
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.41379310344827586,
"acc_stderr": 0.03465304488406796,
"acc_norm": 0.41379310344827586,
"acc_norm_stderr": 0.03465304488406796
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7151515151515152,
"acc_stderr": 0.03524390844511781,
"acc_norm": 0.7151515151515152,
"acc_norm_stderr": 0.03524390844511781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.0331847733384533,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.0331847733384533
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7979274611398963,
"acc_stderr": 0.028979089794296732,
"acc_norm": 0.7979274611398963,
"acc_norm_stderr": 0.028979089794296732
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5435897435897435,
"acc_stderr": 0.025254485424799605,
"acc_norm": 0.5435897435897435,
"acc_norm_stderr": 0.025254485424799605
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.028820884666253255,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.028820884666253255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5672268907563025,
"acc_stderr": 0.032183581077426124,
"acc_norm": 0.5672268907563025,
"acc_norm_stderr": 0.032183581077426124
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7357798165137615,
"acc_stderr": 0.018904164171510168,
"acc_norm": 0.7357798165137615,
"acc_norm_stderr": 0.018904164171510168
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4212962962962963,
"acc_stderr": 0.03367462138896078,
"acc_norm": 0.4212962962962963,
"acc_norm_stderr": 0.03367462138896078
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7303921568627451,
"acc_stderr": 0.031145570659486782,
"acc_norm": 0.7303921568627451,
"acc_norm_stderr": 0.031145570659486782
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7257383966244726,
"acc_stderr": 0.02904133351059804,
"acc_norm": 0.7257383966244726,
"acc_norm_stderr": 0.02904133351059804
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6188340807174888,
"acc_stderr": 0.03259625118416827,
"acc_norm": 0.6188340807174888,
"acc_norm_stderr": 0.03259625118416827
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6793893129770993,
"acc_stderr": 0.04093329229834278,
"acc_norm": 0.6793893129770993,
"acc_norm_stderr": 0.04093329229834278
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6528925619834711,
"acc_stderr": 0.043457245702925335,
"acc_norm": 0.6528925619834711,
"acc_norm_stderr": 0.043457245702925335
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.04557239513497752,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.04557239513497752
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6625766871165644,
"acc_stderr": 0.03714908409935573,
"acc_norm": 0.6625766871165644,
"acc_norm_stderr": 0.03714908409935573
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.7281553398058253,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.7281553398058253,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.811965811965812,
"acc_stderr": 0.025598193686652268,
"acc_norm": 0.811965811965812,
"acc_norm_stderr": 0.025598193686652268
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695238,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695238
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7547892720306514,
"acc_stderr": 0.01538435228454393,
"acc_norm": 0.7547892720306514,
"acc_norm_stderr": 0.01538435228454393
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6329479768786127,
"acc_stderr": 0.025950054337654075,
"acc_norm": 0.6329479768786127,
"acc_norm_stderr": 0.025950054337654075
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3463687150837989,
"acc_stderr": 0.015913546784020117,
"acc_norm": 0.3463687150837989,
"acc_norm_stderr": 0.015913546784020117
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5915032679738562,
"acc_stderr": 0.028146405993096358,
"acc_norm": 0.5915032679738562,
"acc_norm_stderr": 0.028146405993096358
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6141479099678456,
"acc_stderr": 0.027648149599751468,
"acc_norm": 0.6141479099678456,
"acc_norm_stderr": 0.027648149599751468
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.595679012345679,
"acc_stderr": 0.027306625297327684,
"acc_norm": 0.595679012345679,
"acc_norm_stderr": 0.027306625297327684
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4148936170212766,
"acc_stderr": 0.029392236584612503,
"acc_norm": 0.4148936170212766,
"acc_norm_stderr": 0.029392236584612503
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.408735332464146,
"acc_stderr": 0.012555701346703384,
"acc_norm": 0.408735332464146,
"acc_norm_stderr": 0.012555701346703384
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5992647058823529,
"acc_stderr": 0.02976826352893311,
"acc_norm": 0.5992647058823529,
"acc_norm_stderr": 0.02976826352893311
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5588235294117647,
"acc_stderr": 0.020087362076702853,
"acc_norm": 0.5588235294117647,
"acc_norm_stderr": 0.020087362076702853
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6181818181818182,
"acc_stderr": 0.046534298079135075,
"acc_norm": 0.6181818181818182,
"acc_norm_stderr": 0.046534298079135075
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7020408163265306,
"acc_stderr": 0.02927956741106568,
"acc_norm": 0.7020408163265306,
"acc_norm_stderr": 0.02927956741106568
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7114427860696517,
"acc_stderr": 0.03203841040213321,
"acc_norm": 0.7114427860696517,
"acc_norm_stderr": 0.03203841040213321
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7309941520467836,
"acc_stderr": 0.03401052620104089,
"acc_norm": 0.7309941520467836,
"acc_norm_stderr": 0.03401052620104089
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4369645042839657,
"mc1_stderr": 0.01736384450319598,
"mc2": 0.6108755423851862,
"mc2_stderr": 0.015320480953391271
}
}
```
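If you prefer working with the raw file, here is a minimal sketch (assuming `huggingface_hub` is installed) that downloads the JSON linked above and lists its top-level blocks:
```python
import json

from huggingface_hub import hf_hub_download

# Fetch the raw results file for this run (file name taken from the link above).
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_ehartford__dolphin-2.0-mistral-7b",
    filename="results_2023-10-09T12-06-26.268228.json",
    repo_type="dataset",
)
with open(path) as f:
    run = json.load(f)
print(list(run.keys()))  # e.g. the per-task blocks excerpted above
```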
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_ehartford__samantha-mistral-7b | 2023-10-09T12:12:42.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of ehartford/samantha-mistral-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ehartford/samantha-mistral-7b](https://huggingface.co/ehartford/samantha-mistral-7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ehartford__samantha-mistral-7b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-09T12:11:18.939016](https://huggingface.co/datasets/open-llm-leaderboard/details_ehartford__samantha-mistral-7b/blob/main/results_2023-10-09T12-11-18.939016.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.613800549861161,\n\
\ \"acc_stderr\": 0.03381088594806451,\n \"acc_norm\": 0.617838758010754,\n\
\ \"acc_norm_stderr\": 0.033786866951853775,\n \"mc1\": 0.3084455324357405,\n\
\ \"mc1_stderr\": 0.01616803938315687,\n \"mc2\": 0.4608281140841834,\n\
\ \"mc2_stderr\": 0.014866840498694427\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5895904436860068,\n \"acc_stderr\": 0.01437492219264266,\n\
\ \"acc_norm\": 0.6339590443686007,\n \"acc_norm_stderr\": 0.014077223108470139\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6470822545309699,\n\
\ \"acc_stderr\": 0.004769007545082275,\n \"acc_norm\": 0.8409679346743677,\n\
\ \"acc_norm_stderr\": 0.003649585852821188\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \
\ \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"\
acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n\
\ \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6754716981132075,\n \"acc_stderr\": 0.028815615713432118,\n\
\ \"acc_norm\": 0.6754716981132075,\n \"acc_norm_stderr\": 0.028815615713432118\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7013888888888888,\n\
\ \"acc_stderr\": 0.03827052357950756,\n \"acc_norm\": 0.7013888888888888,\n\
\ \"acc_norm_stderr\": 0.03827052357950756\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.4,\n\
\ \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \
\ \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.6242774566473989,\n \"acc_stderr\": 0.036928207672648664,\n\
\ \"acc_norm\": 0.6242774566473989,\n \"acc_norm_stderr\": 0.036928207672648664\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.43137254901960786,\n\
\ \"acc_stderr\": 0.04928099597287533,\n \"acc_norm\": 0.43137254901960786,\n\
\ \"acc_norm_stderr\": 0.04928099597287533\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\":\
\ 0.5361702127659574,\n \"acc_stderr\": 0.03260038511835771,\n \"\
acc_norm\": 0.5361702127659574,\n \"acc_norm_stderr\": 0.03260038511835771\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.496551724137931,\n \"acc_stderr\": 0.041665675771015785,\n\
\ \"acc_norm\": 0.496551724137931,\n \"acc_norm_stderr\": 0.041665675771015785\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.37566137566137564,\n \"acc_stderr\": 0.024942368931159784,\n \"\
acc_norm\": 0.37566137566137564,\n \"acc_norm_stderr\": 0.024942368931159784\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n\
\ \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n\
\ \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7290322580645161,\n\
\ \"acc_stderr\": 0.025284416114900156,\n \"acc_norm\": 0.7290322580645161,\n\
\ \"acc_norm_stderr\": 0.025284416114900156\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.47783251231527096,\n \"acc_stderr\": 0.03514528562175008,\n\
\ \"acc_norm\": 0.47783251231527096,\n \"acc_norm_stderr\": 0.03514528562175008\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\"\
: 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7393939393939394,\n \"acc_stderr\": 0.034277431758165236,\n\
\ \"acc_norm\": 0.7393939393939394,\n \"acc_norm_stderr\": 0.034277431758165236\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7373737373737373,\n \"acc_stderr\": 0.031353050095330855,\n \"\
acc_norm\": 0.7373737373737373,\n \"acc_norm_stderr\": 0.031353050095330855\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8601036269430051,\n \"acc_stderr\": 0.025033870583015184,\n\
\ \"acc_norm\": 0.8601036269430051,\n \"acc_norm_stderr\": 0.025033870583015184\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6205128205128205,\n \"acc_stderr\": 0.02460362692409742,\n \
\ \"acc_norm\": 0.6205128205128205,\n \"acc_norm_stderr\": 0.02460362692409742\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3592592592592593,\n \"acc_stderr\": 0.02925290592725198,\n \
\ \"acc_norm\": 0.3592592592592593,\n \"acc_norm_stderr\": 0.02925290592725198\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6218487394957983,\n \"acc_stderr\": 0.031499305777849054,\n\
\ \"acc_norm\": 0.6218487394957983,\n \"acc_norm_stderr\": 0.031499305777849054\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.25165562913907286,\n \"acc_stderr\": 0.035433042343899844,\n \"\
acc_norm\": 0.25165562913907286,\n \"acc_norm_stderr\": 0.035433042343899844\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7944954128440367,\n \"acc_stderr\": 0.017324352325016005,\n \"\
acc_norm\": 0.7944954128440367,\n \"acc_norm_stderr\": 0.017324352325016005\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.49074074074074076,\n \"acc_stderr\": 0.03409386946992699,\n \"\
acc_norm\": 0.49074074074074076,\n \"acc_norm_stderr\": 0.03409386946992699\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7352941176470589,\n \"acc_stderr\": 0.030964517926923403,\n \"\
acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.030964517926923403\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7679324894514767,\n \"acc_stderr\": 0.027479744550808503,\n \
\ \"acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.027479744550808503\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6547085201793722,\n\
\ \"acc_stderr\": 0.031911001928357954,\n \"acc_norm\": 0.6547085201793722,\n\
\ \"acc_norm_stderr\": 0.031911001928357954\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7022900763358778,\n \"acc_stderr\": 0.040103589424622034,\n\
\ \"acc_norm\": 0.7022900763358778,\n \"acc_norm_stderr\": 0.040103589424622034\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7107438016528925,\n \"acc_stderr\": 0.04139112727635463,\n \"\
acc_norm\": 0.7107438016528925,\n \"acc_norm_stderr\": 0.04139112727635463\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.043300437496507437,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.043300437496507437\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.03462419931615623,\n\
\ \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.03462419931615623\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.811965811965812,\n\
\ \"acc_stderr\": 0.025598193686652258,\n \"acc_norm\": 0.811965811965812,\n\
\ \"acc_norm_stderr\": 0.025598193686652258\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7918263090676884,\n\
\ \"acc_stderr\": 0.014518592248904033,\n \"acc_norm\": 0.7918263090676884,\n\
\ \"acc_norm_stderr\": 0.014518592248904033\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6734104046242775,\n \"acc_stderr\": 0.02524826477424284,\n\
\ \"acc_norm\": 0.6734104046242775,\n \"acc_norm_stderr\": 0.02524826477424284\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.37094972067039106,\n\
\ \"acc_stderr\": 0.01615591072134177,\n \"acc_norm\": 0.37094972067039106,\n\
\ \"acc_norm_stderr\": 0.01615591072134177\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.738562091503268,\n \"acc_stderr\": 0.025160998214292456,\n\
\ \"acc_norm\": 0.738562091503268,\n \"acc_norm_stderr\": 0.025160998214292456\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6945337620578779,\n\
\ \"acc_stderr\": 0.026160584450140453,\n \"acc_norm\": 0.6945337620578779,\n\
\ \"acc_norm_stderr\": 0.026160584450140453\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7006172839506173,\n \"acc_stderr\": 0.025483115601195462,\n\
\ \"acc_norm\": 0.7006172839506173,\n \"acc_norm_stderr\": 0.025483115601195462\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4397163120567376,\n \"acc_stderr\": 0.02960991207559411,\n \
\ \"acc_norm\": 0.4397163120567376,\n \"acc_norm_stderr\": 0.02960991207559411\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4498044328552803,\n\
\ \"acc_stderr\": 0.012705721498565109,\n \"acc_norm\": 0.4498044328552803,\n\
\ \"acc_norm_stderr\": 0.012705721498565109\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6397058823529411,\n \"acc_stderr\": 0.029163128570670733,\n\
\ \"acc_norm\": 0.6397058823529411,\n \"acc_norm_stderr\": 0.029163128570670733\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6503267973856209,\n \"acc_stderr\": 0.019291961895066382,\n \
\ \"acc_norm\": 0.6503267973856209,\n \"acc_norm_stderr\": 0.019291961895066382\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.04582004841505417,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.04582004841505417\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.710204081632653,\n \"acc_stderr\": 0.029043088683304328,\n\
\ \"acc_norm\": 0.710204081632653,\n \"acc_norm_stderr\": 0.029043088683304328\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n\
\ \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n\
\ \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.035887028128263714,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.035887028128263714\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n\
\ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n\
\ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727682,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727682\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3084455324357405,\n\
\ \"mc1_stderr\": 0.01616803938315687,\n \"mc2\": 0.4608281140841834,\n\
\ \"mc2_stderr\": 0.014866840498694427\n }\n}\n```"
repo_url: https://huggingface.co/ehartford/samantha-mistral-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|arc:challenge|25_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hellaswag|10_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T12-11-18.939016.parquet'
- config_name: results
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- results_2023-10-09T12-11-18.939016.parquet
- split: latest
path:
- results_2023-10-09T12-11-18.939016.parquet
---
# Dataset Card for Evaluation run of ehartford/samantha-mistral-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/ehartford/samantha-mistral-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [ehartford/samantha-mistral-7b](https://huggingface.co/ehartford/samantha-mistral-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ehartford__samantha-mistral-7b",
"harness_truthfulqa_mc_0",
split="train")
```
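Each configuration also defines the run's timestamped split alongside `latest` (see the `configs` section above), so you can pin a specific run instead of always following the most recent one. A minimal sketch, using the split names from this card's YAML header:
```python
from datasets import load_dataset

repo = "open-llm-leaderboard/details_ehartford__samantha-mistral-7b"

# "latest" always points at the most recent run for this configuration.
latest = load_dataset(repo, "harness_truthfulqa_mc_0", split="latest")

# Pinning a specific run uses its timestamped split name instead.
pinned = load_dataset(repo, "harness_truthfulqa_mc_0",
                      split="2023_10_09T12_11_18.939016")
```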
## Latest results
These are the [latest results from run 2023-10-09T12:11:18.939016](https://huggingface.co/datasets/open-llm-leaderboard/details_ehartford__samantha-mistral-7b/blob/main/results_2023-10-09T12-11-18.939016.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.613800549861161,
"acc_stderr": 0.03381088594806451,
"acc_norm": 0.617838758010754,
"acc_norm_stderr": 0.033786866951853775,
"mc1": 0.3084455324357405,
"mc1_stderr": 0.01616803938315687,
"mc2": 0.4608281140841834,
"mc2_stderr": 0.014866840498694427
},
"harness|arc:challenge|25": {
"acc": 0.5895904436860068,
"acc_stderr": 0.01437492219264266,
"acc_norm": 0.6339590443686007,
"acc_norm_stderr": 0.014077223108470139
},
"harness|hellaswag|10": {
"acc": 0.6470822545309699,
"acc_stderr": 0.004769007545082275,
"acc_norm": 0.8409679346743677,
"acc_norm_stderr": 0.003649585852821188
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6754716981132075,
"acc_stderr": 0.028815615713432118,
"acc_norm": 0.6754716981132075,
"acc_norm_stderr": 0.028815615713432118
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7013888888888888,
"acc_stderr": 0.03827052357950756,
"acc_norm": 0.7013888888888888,
"acc_norm_stderr": 0.03827052357950756
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6242774566473989,
"acc_stderr": 0.036928207672648664,
"acc_norm": 0.6242774566473989,
"acc_norm_stderr": 0.036928207672648664
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287533,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287533
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5361702127659574,
"acc_stderr": 0.03260038511835771,
"acc_norm": 0.5361702127659574,
"acc_norm_stderr": 0.03260038511835771
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.496551724137931,
"acc_stderr": 0.041665675771015785,
"acc_norm": 0.496551724137931,
"acc_norm_stderr": 0.041665675771015785
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.37566137566137564,
"acc_stderr": 0.024942368931159784,
"acc_norm": 0.37566137566137564,
"acc_norm_stderr": 0.024942368931159784
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7290322580645161,
"acc_stderr": 0.025284416114900156,
"acc_norm": 0.7290322580645161,
"acc_norm_stderr": 0.025284416114900156
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.47783251231527096,
"acc_stderr": 0.03514528562175008,
"acc_norm": 0.47783251231527096,
"acc_norm_stderr": 0.03514528562175008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7393939393939394,
"acc_stderr": 0.034277431758165236,
"acc_norm": 0.7393939393939394,
"acc_norm_stderr": 0.034277431758165236
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7373737373737373,
"acc_stderr": 0.031353050095330855,
"acc_norm": 0.7373737373737373,
"acc_norm_stderr": 0.031353050095330855
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8601036269430051,
"acc_stderr": 0.025033870583015184,
"acc_norm": 0.8601036269430051,
"acc_norm_stderr": 0.025033870583015184
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6205128205128205,
"acc_stderr": 0.02460362692409742,
"acc_norm": 0.6205128205128205,
"acc_norm_stderr": 0.02460362692409742
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3592592592592593,
"acc_stderr": 0.02925290592725198,
"acc_norm": 0.3592592592592593,
"acc_norm_stderr": 0.02925290592725198
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6218487394957983,
"acc_stderr": 0.031499305777849054,
"acc_norm": 0.6218487394957983,
"acc_norm_stderr": 0.031499305777849054
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.25165562913907286,
"acc_stderr": 0.035433042343899844,
"acc_norm": 0.25165562913907286,
"acc_norm_stderr": 0.035433042343899844
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7944954128440367,
"acc_stderr": 0.017324352325016005,
"acc_norm": 0.7944954128440367,
"acc_norm_stderr": 0.017324352325016005
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49074074074074076,
"acc_stderr": 0.03409386946992699,
"acc_norm": 0.49074074074074076,
"acc_norm_stderr": 0.03409386946992699
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.030964517926923403,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.030964517926923403
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7679324894514767,
"acc_stderr": 0.027479744550808503,
"acc_norm": 0.7679324894514767,
"acc_norm_stderr": 0.027479744550808503
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6547085201793722,
"acc_stderr": 0.031911001928357954,
"acc_norm": 0.6547085201793722,
"acc_norm_stderr": 0.031911001928357954
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7022900763358778,
"acc_stderr": 0.040103589424622034,
"acc_norm": 0.7022900763358778,
"acc_norm_stderr": 0.040103589424622034
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7107438016528925,
"acc_stderr": 0.04139112727635463,
"acc_norm": 0.7107438016528925,
"acc_norm_stderr": 0.04139112727635463
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.043300437496507437,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.043300437496507437
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7361963190184049,
"acc_stderr": 0.03462419931615623,
"acc_norm": 0.7361963190184049,
"acc_norm_stderr": 0.03462419931615623
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.811965811965812,
"acc_stderr": 0.025598193686652258,
"acc_norm": 0.811965811965812,
"acc_norm_stderr": 0.025598193686652258
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7918263090676884,
"acc_stderr": 0.014518592248904033,
"acc_norm": 0.7918263090676884,
"acc_norm_stderr": 0.014518592248904033
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6734104046242775,
"acc_stderr": 0.02524826477424284,
"acc_norm": 0.6734104046242775,
"acc_norm_stderr": 0.02524826477424284
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.37094972067039106,
"acc_stderr": 0.01615591072134177,
"acc_norm": 0.37094972067039106,
"acc_norm_stderr": 0.01615591072134177
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.738562091503268,
"acc_stderr": 0.025160998214292456,
"acc_norm": 0.738562091503268,
"acc_norm_stderr": 0.025160998214292456
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6945337620578779,
"acc_stderr": 0.026160584450140453,
"acc_norm": 0.6945337620578779,
"acc_norm_stderr": 0.026160584450140453
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7006172839506173,
"acc_stderr": 0.025483115601195462,
"acc_norm": 0.7006172839506173,
"acc_norm_stderr": 0.025483115601195462
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4397163120567376,
"acc_stderr": 0.02960991207559411,
"acc_norm": 0.4397163120567376,
"acc_norm_stderr": 0.02960991207559411
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4498044328552803,
"acc_stderr": 0.012705721498565109,
"acc_norm": 0.4498044328552803,
"acc_norm_stderr": 0.012705721498565109
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6397058823529411,
"acc_stderr": 0.029163128570670733,
"acc_norm": 0.6397058823529411,
"acc_norm_stderr": 0.029163128570670733
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6503267973856209,
"acc_stderr": 0.019291961895066382,
"acc_norm": 0.6503267973856209,
"acc_norm_stderr": 0.019291961895066382
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.04582004841505417,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.04582004841505417
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.710204081632653,
"acc_stderr": 0.029043088683304328,
"acc_norm": 0.710204081632653,
"acc_norm_stderr": 0.029043088683304328
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.035887028128263714,
"acc_norm": 0.85,
"acc_norm_stderr": 0.035887028128263714
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727682,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727682
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3084455324357405,
"mc1_stderr": 0.01616803938315687,
"mc2": 0.4608281140841834,
"mc2_stderr": 0.014866840498694427
}
}
```
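The aggregated numbers above can also be loaded as data: the `results` configuration defined in this card's YAML header stores one parquet file per run. A minimal sketch of reading it back (the exact column layout of the results file is not documented here, so the inspection step below is just a way to discover it):
```python
from datasets import load_dataset

# The "results" configuration holds the aggregated scores shown above;
# its "latest" split points at the most recent results_*.parquet file.
results = load_dataset(
    "open-llm-leaderboard/details_ehartford__samantha-mistral-7b",
    "results",
    split="latest",
)

print(results.column_names)  # discover the schema (it varies with the harness version)
print(results[0])            # first row of the aggregated results
```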
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_ehartford__samantha-mistral-instruct-7b | 2023-10-09T12:18:48.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of ehartford/samantha-mistral-instruct-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ehartford/samantha-mistral-instruct-7b](https://huggingface.co/ehartford/samantha-mistral-instruct-7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the most recent results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ehartford__samantha-mistral-instruct-7b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-09T12:17:25.772796](https://huggingface.co/datasets/open-llm-leaderboard/details_ehartford__samantha-mistral-instruct-7b/blob/main/results_2023-10-09T12-17-25.772796.json)\
\ (note that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5175528305757175,\n\
\ \"acc_stderr\": 0.03530048748842879,\n \"acc_norm\": 0.5214842448231866,\n\
\ \"acc_norm_stderr\": 0.035289046419364915,\n \"mc1\": 0.40758873929008566,\n\
\ \"mc1_stderr\": 0.01720194923455311,\n \"mc2\": 0.5880610765476223,\n\
\ \"mc2_stderr\": 0.015477717507753088\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.49402730375426623,\n \"acc_stderr\": 0.014610348300255793,\n\
\ \"acc_norm\": 0.5349829351535836,\n \"acc_norm_stderr\": 0.014575583922019675\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5604461262696674,\n\
\ \"acc_stderr\": 0.004953184534223994,\n \"acc_norm\": 0.7514439354710217,\n\
\ \"acc_norm_stderr\": 0.004312925837691541\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5185185185185185,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.5185185185185185,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5263157894736842,\n \"acc_stderr\": 0.04063302731486671,\n\
\ \"acc_norm\": 0.5263157894736842,\n \"acc_norm_stderr\": 0.04063302731486671\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5811320754716981,\n \"acc_stderr\": 0.03036505082911521,\n\
\ \"acc_norm\": 0.5811320754716981,\n \"acc_norm_stderr\": 0.03036505082911521\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5416666666666666,\n\
\ \"acc_stderr\": 0.04166666666666665,\n \"acc_norm\": 0.5416666666666666,\n\
\ \"acc_norm_stderr\": 0.04166666666666665\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n\
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4913294797687861,\n\
\ \"acc_stderr\": 0.038118909889404126,\n \"acc_norm\": 0.4913294797687861,\n\
\ \"acc_norm_stderr\": 0.038118909889404126\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.044405219061793275,\n\
\ \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.044405219061793275\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n\
\ \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.46808510638297873,\n \"acc_stderr\": 0.03261936918467382,\n\
\ \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.03261936918467382\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.35964912280701755,\n\
\ \"acc_stderr\": 0.04514496132873633,\n \"acc_norm\": 0.35964912280701755,\n\
\ \"acc_norm_stderr\": 0.04514496132873633\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.496551724137931,\n \"acc_stderr\": 0.041665675771015785,\n\
\ \"acc_norm\": 0.496551724137931,\n \"acc_norm_stderr\": 0.041665675771015785\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.29365079365079366,\n \"acc_stderr\": 0.02345603738398203,\n \"\
acc_norm\": 0.29365079365079366,\n \"acc_norm_stderr\": 0.02345603738398203\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n\
\ \"acc_stderr\": 0.04325506042017086,\n \"acc_norm\": 0.373015873015873,\n\
\ \"acc_norm_stderr\": 0.04325506042017086\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6225806451612903,\n\
\ \"acc_stderr\": 0.027575960723278226,\n \"acc_norm\": 0.6225806451612903,\n\
\ \"acc_norm_stderr\": 0.027575960723278226\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3842364532019704,\n \"acc_stderr\": 0.034223985656575494,\n\
\ \"acc_norm\": 0.3842364532019704,\n \"acc_norm_stderr\": 0.034223985656575494\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\"\
: 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.03756335775187898,\n\
\ \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.03756335775187898\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6212121212121212,\n \"acc_stderr\": 0.03456088731993747,\n \"\
acc_norm\": 0.6212121212121212,\n \"acc_norm_stderr\": 0.03456088731993747\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7253886010362695,\n \"acc_stderr\": 0.03221024508041153,\n\
\ \"acc_norm\": 0.7253886010362695,\n \"acc_norm_stderr\": 0.03221024508041153\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.47692307692307695,\n \"acc_stderr\": 0.025323990861736118,\n\
\ \"acc_norm\": 0.47692307692307695,\n \"acc_norm_stderr\": 0.025323990861736118\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3148148148148148,\n \"acc_stderr\": 0.028317533496066475,\n \
\ \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.028317533496066475\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.4831932773109244,\n \"acc_stderr\": 0.03246013680375308,\n \
\ \"acc_norm\": 0.4831932773109244,\n \"acc_norm_stderr\": 0.03246013680375308\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658753,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658753\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6678899082568808,\n \"acc_stderr\": 0.020192682985423326,\n \"\
acc_norm\": 0.6678899082568808,\n \"acc_norm_stderr\": 0.020192682985423326\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.375,\n \"acc_stderr\": 0.033016908987210894,\n \"acc_norm\": 0.375,\n\
\ \"acc_norm_stderr\": 0.033016908987210894\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.5882352941176471,\n \"acc_stderr\": 0.0345423658538061,\n\
\ \"acc_norm\": 0.5882352941176471,\n \"acc_norm_stderr\": 0.0345423658538061\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6540084388185654,\n \"acc_stderr\": 0.030964810588786716,\n \
\ \"acc_norm\": 0.6540084388185654,\n \"acc_norm_stderr\": 0.030964810588786716\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.515695067264574,\n\
\ \"acc_stderr\": 0.0335412657542081,\n \"acc_norm\": 0.515695067264574,\n\
\ \"acc_norm_stderr\": 0.0335412657542081\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6793893129770993,\n \"acc_stderr\": 0.040933292298342784,\n\
\ \"acc_norm\": 0.6793893129770993,\n \"acc_norm_stderr\": 0.040933292298342784\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6115702479338843,\n \"acc_stderr\": 0.04449270350068382,\n \"\
acc_norm\": 0.6115702479338843,\n \"acc_norm_stderr\": 0.04449270350068382\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5555555555555556,\n\
\ \"acc_stderr\": 0.04803752235190193,\n \"acc_norm\": 0.5555555555555556,\n\
\ \"acc_norm_stderr\": 0.04803752235190193\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6257668711656442,\n \"acc_stderr\": 0.03802068102899616,\n\
\ \"acc_norm\": 0.6257668711656442,\n \"acc_norm_stderr\": 0.03802068102899616\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4017857142857143,\n\
\ \"acc_stderr\": 0.04653333146973646,\n \"acc_norm\": 0.4017857142857143,\n\
\ \"acc_norm_stderr\": 0.04653333146973646\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6407766990291263,\n \"acc_stderr\": 0.04750458399041696,\n\
\ \"acc_norm\": 0.6407766990291263,\n \"acc_norm_stderr\": 0.04750458399041696\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7307692307692307,\n\
\ \"acc_stderr\": 0.029058588303748845,\n \"acc_norm\": 0.7307692307692307,\n\
\ \"acc_norm_stderr\": 0.029058588303748845\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6845466155810983,\n\
\ \"acc_stderr\": 0.01661750173876339,\n \"acc_norm\": 0.6845466155810983,\n\
\ \"acc_norm_stderr\": 0.01661750173876339\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5780346820809249,\n \"acc_stderr\": 0.02658923114217426,\n\
\ \"acc_norm\": 0.5780346820809249,\n \"acc_norm_stderr\": 0.02658923114217426\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.28938547486033517,\n\
\ \"acc_stderr\": 0.015166544550490298,\n \"acc_norm\": 0.28938547486033517,\n\
\ \"acc_norm_stderr\": 0.015166544550490298\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5620915032679739,\n \"acc_stderr\": 0.02840830202033269,\n\
\ \"acc_norm\": 0.5620915032679739,\n \"acc_norm_stderr\": 0.02840830202033269\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5627009646302251,\n\
\ \"acc_stderr\": 0.028173917761762906,\n \"acc_norm\": 0.5627009646302251,\n\
\ \"acc_norm_stderr\": 0.028173917761762906\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.595679012345679,\n \"acc_stderr\": 0.027306625297327688,\n\
\ \"acc_norm\": 0.595679012345679,\n \"acc_norm_stderr\": 0.027306625297327688\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3617021276595745,\n \"acc_stderr\": 0.02866382014719948,\n \
\ \"acc_norm\": 0.3617021276595745,\n \"acc_norm_stderr\": 0.02866382014719948\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3468057366362451,\n\
\ \"acc_stderr\": 0.012156071332318713,\n \"acc_norm\": 0.3468057366362451,\n\
\ \"acc_norm_stderr\": 0.012156071332318713\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5551470588235294,\n \"acc_stderr\": 0.030187532060329383,\n\
\ \"acc_norm\": 0.5551470588235294,\n \"acc_norm_stderr\": 0.030187532060329383\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.47549019607843135,\n \"acc_stderr\": 0.020203517280261447,\n \
\ \"acc_norm\": 0.47549019607843135,\n \"acc_norm_stderr\": 0.020203517280261447\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6090909090909091,\n\
\ \"acc_stderr\": 0.04673752333670239,\n \"acc_norm\": 0.6090909090909091,\n\
\ \"acc_norm_stderr\": 0.04673752333670239\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5918367346938775,\n \"acc_stderr\": 0.03146465712827424,\n\
\ \"acc_norm\": 0.5918367346938775,\n \"acc_norm_stderr\": 0.03146465712827424\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6169154228855721,\n\
\ \"acc_stderr\": 0.0343751933733825,\n \"acc_norm\": 0.6169154228855721,\n\
\ \"acc_norm_stderr\": 0.0343751933733825\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.46987951807228917,\n\
\ \"acc_stderr\": 0.03885425420866766,\n \"acc_norm\": 0.46987951807228917,\n\
\ \"acc_norm_stderr\": 0.03885425420866766\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7017543859649122,\n \"acc_stderr\": 0.03508771929824564,\n\
\ \"acc_norm\": 0.7017543859649122,\n \"acc_norm_stderr\": 0.03508771929824564\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.40758873929008566,\n\
\ \"mc1_stderr\": 0.01720194923455311,\n \"mc2\": 0.5880610765476223,\n\
\ \"mc2_stderr\": 0.015477717507753088\n }\n}\n```"
repo_url: https://huggingface.co/ehartford/samantha-mistral-instruct-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|arc:challenge|25_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hellaswag|10_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T12-17-25.772796.parquet'
- config_name: results
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- results_2023-10-09T12-17-25.772796.parquet
- split: latest
path:
- results_2023-10-09T12-17-25.772796.parquet
---
# Dataset Card for Evaluation run of ehartford/samantha-mistral-instruct-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/ehartford/samantha-mistral-instruct-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [ehartford/samantha-mistral-instruct-7b](https://huggingface.co/ehartford/samantha-mistral-instruct-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ehartford__samantha-mistral-instruct-7b",
"harness_truthfulqa_mc_0",
split="train")
```
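You can also pull any per-task configuration directly. As a minimal sketch (the config and split names below are assumptions read off the YAML header of this card), loading the `latest` split of the 5-shot high-school geography task would look like:
```python
from datasets import load_dataset

# Load the "latest" split of a single per-task configuration.
# The config and split names are taken from the YAML header above.
data = load_dataset(
    "open-llm-leaderboard/details_ehartford__samantha-mistral-instruct-7b",
    "harness_hendrycksTest_high_school_geography_5",
    split="latest",
)
print(data)
```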
## Latest results
These are the [latest results from run 2023-10-09T12:17:25.772796](https://huggingface.co/datasets/open-llm-leaderboard/details_ehartford__samantha-mistral-instruct-7b/blob/main/results_2023-10-09T12-17-25.772796.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5175528305757175,
"acc_stderr": 0.03530048748842879,
"acc_norm": 0.5214842448231866,
"acc_norm_stderr": 0.035289046419364915,
"mc1": 0.40758873929008566,
"mc1_stderr": 0.01720194923455311,
"mc2": 0.5880610765476223,
"mc2_stderr": 0.015477717507753088
},
"harness|arc:challenge|25": {
"acc": 0.49402730375426623,
"acc_stderr": 0.014610348300255793,
"acc_norm": 0.5349829351535836,
"acc_norm_stderr": 0.014575583922019675
},
"harness|hellaswag|10": {
"acc": 0.5604461262696674,
"acc_stderr": 0.004953184534223994,
"acc_norm": 0.7514439354710217,
"acc_norm_stderr": 0.004312925837691541
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5263157894736842,
"acc_stderr": 0.04063302731486671,
"acc_norm": 0.5263157894736842,
"acc_norm_stderr": 0.04063302731486671
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5811320754716981,
"acc_stderr": 0.03036505082911521,
"acc_norm": 0.5811320754716981,
"acc_norm_stderr": 0.03036505082911521
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5416666666666666,
"acc_stderr": 0.04166666666666665,
"acc_norm": 0.5416666666666666,
"acc_norm_stderr": 0.04166666666666665
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4913294797687861,
"acc_stderr": 0.038118909889404126,
"acc_norm": 0.4913294797687861,
"acc_norm_stderr": 0.038118909889404126
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.044405219061793275,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.044405219061793275
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.03261936918467382,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.03261936918467382
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.35964912280701755,
"acc_stderr": 0.04514496132873633,
"acc_norm": 0.35964912280701755,
"acc_norm_stderr": 0.04514496132873633
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.496551724137931,
"acc_stderr": 0.041665675771015785,
"acc_norm": 0.496551724137931,
"acc_norm_stderr": 0.041665675771015785
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.29365079365079366,
"acc_stderr": 0.02345603738398203,
"acc_norm": 0.29365079365079366,
"acc_norm_stderr": 0.02345603738398203
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.04325506042017086,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.04325506042017086
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6225806451612903,
"acc_stderr": 0.027575960723278226,
"acc_norm": 0.6225806451612903,
"acc_norm_stderr": 0.027575960723278226
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3842364532019704,
"acc_stderr": 0.034223985656575494,
"acc_norm": 0.3842364532019704,
"acc_norm_stderr": 0.034223985656575494
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.03756335775187898,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.03756335775187898
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6212121212121212,
"acc_stderr": 0.03456088731993747,
"acc_norm": 0.6212121212121212,
"acc_norm_stderr": 0.03456088731993747
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7253886010362695,
"acc_stderr": 0.03221024508041153,
"acc_norm": 0.7253886010362695,
"acc_norm_stderr": 0.03221024508041153
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.47692307692307695,
"acc_stderr": 0.025323990861736118,
"acc_norm": 0.47692307692307695,
"acc_norm_stderr": 0.025323990861736118
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.028317533496066475,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.028317533496066475
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4831932773109244,
"acc_stderr": 0.03246013680375308,
"acc_norm": 0.4831932773109244,
"acc_norm_stderr": 0.03246013680375308
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658753,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658753
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6678899082568808,
"acc_stderr": 0.020192682985423326,
"acc_norm": 0.6678899082568808,
"acc_norm_stderr": 0.020192682985423326
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.375,
"acc_stderr": 0.033016908987210894,
"acc_norm": 0.375,
"acc_norm_stderr": 0.033016908987210894
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5882352941176471,
"acc_stderr": 0.0345423658538061,
"acc_norm": 0.5882352941176471,
"acc_norm_stderr": 0.0345423658538061
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6540084388185654,
"acc_stderr": 0.030964810588786716,
"acc_norm": 0.6540084388185654,
"acc_norm_stderr": 0.030964810588786716
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.515695067264574,
"acc_stderr": 0.0335412657542081,
"acc_norm": 0.515695067264574,
"acc_norm_stderr": 0.0335412657542081
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6793893129770993,
"acc_stderr": 0.040933292298342784,
"acc_norm": 0.6793893129770993,
"acc_norm_stderr": 0.040933292298342784
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6115702479338843,
"acc_stderr": 0.04449270350068382,
"acc_norm": 0.6115702479338843,
"acc_norm_stderr": 0.04449270350068382
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.04803752235190193,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.04803752235190193
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6257668711656442,
"acc_stderr": 0.03802068102899616,
"acc_norm": 0.6257668711656442,
"acc_norm_stderr": 0.03802068102899616
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4017857142857143,
"acc_stderr": 0.04653333146973646,
"acc_norm": 0.4017857142857143,
"acc_norm_stderr": 0.04653333146973646
},
"harness|hendrycksTest-management|5": {
"acc": 0.6407766990291263,
"acc_stderr": 0.04750458399041696,
"acc_norm": 0.6407766990291263,
"acc_norm_stderr": 0.04750458399041696
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7307692307692307,
"acc_stderr": 0.029058588303748845,
"acc_norm": 0.7307692307692307,
"acc_norm_stderr": 0.029058588303748845
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6845466155810983,
"acc_stderr": 0.01661750173876339,
"acc_norm": 0.6845466155810983,
"acc_norm_stderr": 0.01661750173876339
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5780346820809249,
"acc_stderr": 0.02658923114217426,
"acc_norm": 0.5780346820809249,
"acc_norm_stderr": 0.02658923114217426
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.28938547486033517,
"acc_stderr": 0.015166544550490298,
"acc_norm": 0.28938547486033517,
"acc_norm_stderr": 0.015166544550490298
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5620915032679739,
"acc_stderr": 0.02840830202033269,
"acc_norm": 0.5620915032679739,
"acc_norm_stderr": 0.02840830202033269
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5627009646302251,
"acc_stderr": 0.028173917761762906,
"acc_norm": 0.5627009646302251,
"acc_norm_stderr": 0.028173917761762906
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.595679012345679,
"acc_stderr": 0.027306625297327688,
"acc_norm": 0.595679012345679,
"acc_norm_stderr": 0.027306625297327688
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3617021276595745,
"acc_stderr": 0.02866382014719948,
"acc_norm": 0.3617021276595745,
"acc_norm_stderr": 0.02866382014719948
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3468057366362451,
"acc_stderr": 0.012156071332318713,
"acc_norm": 0.3468057366362451,
"acc_norm_stderr": 0.012156071332318713
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5551470588235294,
"acc_stderr": 0.030187532060329383,
"acc_norm": 0.5551470588235294,
"acc_norm_stderr": 0.030187532060329383
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.47549019607843135,
"acc_stderr": 0.020203517280261447,
"acc_norm": 0.47549019607843135,
"acc_norm_stderr": 0.020203517280261447
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6090909090909091,
"acc_stderr": 0.04673752333670239,
"acc_norm": 0.6090909090909091,
"acc_norm_stderr": 0.04673752333670239
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5918367346938775,
"acc_stderr": 0.03146465712827424,
"acc_norm": 0.5918367346938775,
"acc_norm_stderr": 0.03146465712827424
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6169154228855721,
"acc_stderr": 0.0343751933733825,
"acc_norm": 0.6169154228855721,
"acc_norm_stderr": 0.0343751933733825
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-virology|5": {
"acc": 0.46987951807228917,
"acc_stderr": 0.03885425420866766,
"acc_norm": 0.46987951807228917,
"acc_norm_stderr": 0.03885425420866766
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7017543859649122,
"acc_stderr": 0.03508771929824564,
"acc_norm": 0.7017543859649122,
"acc_norm_stderr": 0.03508771929824564
},
"harness|truthfulqa:mc|0": {
"mc1": 0.40758873929008566,
"mc1_stderr": 0.01720194923455311,
"mc2": 0.5880610765476223,
"mc2_stderr": 0.015477717507753088
}
}
```
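To read these aggregated numbers back programmatically, a minimal sketch (assuming the `results` parquet mirrors the JSON structure shown above) is:
```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics; its
# "latest" split always points at the most recent run.
results = load_dataset(
    "open-llm-leaderboard/details_ehartford__samantha-mistral-instruct-7b",
    "results",
    split="latest",
)
# Inspect the first row of aggregated results (e.g. the "all" block above).
print(results[0])
```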
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
dmrau/cqudubstack-android | 2023-10-09T12:19:38.000Z | [
"region:us"
] | dmrau | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: queries
path: data/queries-*
- split: corpus
path: data/corpus-*
dataset_info:
features:
- name: _id
dtype: string
- name: text
dtype: string
- name: title
dtype: string
splits:
- name: queries
num_bytes: 47953
num_examples: 699
- name: corpus
num_bytes: 12840959
num_examples: 22998
download_size: 7657118
dataset_size: 12888912
---
# Dataset Card for "cqudubstack-android"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dmrau/cqadubstack-android-qrels | 2023-10-09T12:19:39.000Z | [
"region:us"
] | dmrau | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
dataset_info:
features:
- name: query-id
dtype: string
- name: corpus-id
dtype: string
- name: score
dtype: int64
splits:
- name: test
num_bytes: 43411
num_examples: 1696
download_size: 19993
dataset_size: 43411
---
# Dataset Card for "cqadubstack-android-qrels"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dmrau/cqudubstack-gaming | 2023-10-09T12:19:52.000Z | [
"region:us"
] | dmrau | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: queries
path: data/queries-*
- split: corpus
path: data/corpus-*
dataset_info:
features:
- name: _id
dtype: string
- name: text
dtype: string
- name: title
dtype: string
splits:
- name: queries
num_bytes: 105494
num_examples: 1595
- name: corpus
num_bytes: 20666596
num_examples: 45301
download_size: 12946080
dataset_size: 20772090
---
# Dataset Card for "cqudubstack-gaming"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dmrau/cqadubstack-gaming-qrels | 2023-10-09T12:19:53.000Z | [
"region:us"
] | dmrau | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
dataset_info:
features:
- name: query-id
dtype: string
- name: corpus-id
dtype: string
- name: score
dtype: int64
splits:
- name: test
num_bytes: 60520
num_examples: 2263
download_size: 32524
dataset_size: 60520
---
# Dataset Card for "cqadubstack-gaming-qrels"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dmrau/cqudubstack-mathematica | 2023-10-09T12:20:00.000Z | [
"region:us"
] | dmrau | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: queries
path: data/queries-*
- split: corpus
path: data/corpus-*
dataset_info:
features:
- name: _id
dtype: string
- name: text
dtype: string
- name: title
dtype: string
splits:
- name: queries
num_bytes: 52792
num_examples: 804
- name: corpus
num_bytes: 18735825
num_examples: 16705
download_size: 10393860
dataset_size: 18788617
---
# Dataset Card for "cqudubstack-mathematica"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dmrau/cqadubstack-mathematica-qrels | 2023-10-09T12:20:02.000Z | [
"region:us"
] | dmrau | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
dataset_info:
features:
- name: query-id
dtype: string
- name: corpus-id
dtype: string
- name: score
dtype: int64
splits:
- name: test
num_bytes: 34691
num_examples: 1358
download_size: 18181
dataset_size: 34691
---
# Dataset Card for "cqadubstack-mathematica-qrels"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dmrau/cqudubstack-programmers | 2023-10-09T12:20:11.000Z | [
"region:us"
] | dmrau | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: queries
path: data/queries-*
- split: corpus
path: data/corpus-*
dataset_info:
features:
- name: _id
dtype: string
- name: text
dtype: string
- name: title
dtype: string
splits:
- name: queries
num_bytes: 63785
num_examples: 876
- name: corpus
num_bytes: 32727262
num_examples: 32176
download_size: 19360000
dataset_size: 32791047
---
# Dataset Card for "cqudubstack-programmers"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dmrau/cqadubstack-programmers-qrels | 2023-10-09T12:20:13.000Z | [
"region:us"
] | dmrau | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
dataset_info:
features:
- name: query-id
dtype: string
- name: corpus-id
dtype: string
- name: score
dtype: int64
splits:
- name: test
num_bytes: 45452
num_examples: 1675
download_size: 22632
dataset_size: 45452
---
# Dataset Card for "cqadubstack-programmers-qrels"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dmrau/cqudubstack-tex | 2023-10-09T12:21:06.000Z | [
"region:us"
] | dmrau | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: queries
path: data/queries-*
- split: corpus
path: data/corpus-*
dataset_info:
features:
- name: _id
dtype: string
- name: text
dtype: string
- name: title
dtype: string
splits:
- name: queries
num_bytes: 186934
num_examples: 2906
- name: corpus
num_bytes: 86600423
num_examples: 68184
download_size: 43424126
dataset_size: 86787357
---
# Dataset Card for "cqudubstack-tex"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dmrau/cqadubstack-tex-qrels | 2023-10-09T12:21:08.000Z | [
"region:us"
] | dmrau | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
dataset_info:
features:
- name: query-id
dtype: string
- name: corpus-id
dtype: string
- name: score
dtype: int64
splits:
- name: test
num_bytes: 137572
num_examples: 5154
download_size: 67107
dataset_size: 137572
---
# Dataset Card for "cqadubstack-tex-qrels"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
BroUP/No | 2023-10-09T12:21:07.000Z | [
"license:mit",
"region:us"
] | BroUP | null | null | null | 0 | 0 | ---
license: mit
---
|
dmrau/cqudubstack-webmasters | 2023-10-09T12:21:13.000Z | [
"region:us"
] | dmrau | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: queries
path: data/queries-*
- split: corpus
path: data/corpus-*
dataset_info:
features:
- name: _id
dtype: string
- name: text
dtype: string
- name: title
dtype: string
splits:
- name: queries
num_bytes: 34792
num_examples: 506
- name: corpus
num_bytes: 11659413
num_examples: 17405
download_size: 6885106
dataset_size: 11694205
---
# Dataset Card for "cqudubstack-webmasters"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dmrau/cqadubstack-webmasters-qrels | 2023-10-09T12:21:14.000Z | [
"region:us"
] | dmrau | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
dataset_info:
features:
- name: query-id
dtype: string
- name: corpus-id
dtype: string
- name: score
dtype: int64
splits:
- name: test
num_bytes: 35771
num_examples: 1395
download_size: 16248
dataset_size: 35771
---
# Dataset Card for "cqadubstack-webmasters-qrels"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dmrau/cqudubstack-english | 2023-10-09T12:21:27.000Z | [
"region:us"
] | dmrau | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: queries
path: data/queries-*
- split: corpus
path: data/corpus-*
dataset_info:
features:
- name: _id
dtype: string
- name: text
dtype: string
- name: title
dtype: string
splits:
- name: queries
num_bytes: 103588
num_examples: 1570
- name: corpus
num_bytes: 18199570
num_examples: 40221
download_size: 11382247
dataset_size: 18303158
---
# Dataset Card for "cqudubstack-english"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dmrau/cqadubstack-english-qrels | 2023-10-09T12:21:28.000Z | [
"region:us"
] | dmrau | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
dataset_info:
features:
- name: query-id
dtype: string
- name: corpus-id
dtype: string
- name: score
dtype: int64
splits:
- name: test
num_bytes: 100171
num_examples: 3765
download_size: 45031
dataset_size: 100171
---
# Dataset Card for "cqadubstack-english-qrels"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dmrau/cqudubstack-gis | 2023-10-09T12:21:37.000Z | [
"region:us"
] | dmrau | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: queries
path: data/queries-*
- split: corpus
path: data/corpus-*
dataset_info:
features:
- name: _id
dtype: string
- name: text
dtype: string
- name: title
dtype: string
splits:
- name: queries
num_bytes: 61244
num_examples: 885
- name: corpus
num_bytes: 36704924
num_examples: 37637
download_size: 20083359
dataset_size: 36766168
---
# Dataset Card for "cqudubstack-gis"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dmrau/cqadubstack-gis-qrels | 2023-10-09T12:21:39.000Z | [
"region:us"
] | dmrau | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
dataset_info:
features:
- name: query-id
dtype: string
- name: corpus-id
dtype: string
- name: score
dtype: int64
splits:
- name: test
num_bytes: 28952
num_examples: 1114
download_size: 17234
dataset_size: 28952
---
# Dataset Card for "cqadubstack-gis-qrels"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dmrau/cqudubstack-physics | 2023-10-09T12:21:48.000Z | [
"region:us"
] | dmrau | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: queries
path: data/queries-*
- split: corpus
path: data/corpus-*
dataset_info:
features:
- name: _id
dtype: string
- name: text
dtype: string
- name: title
dtype: string
splits:
- name: queries
num_bytes: 73255
num_examples: 1039
- name: corpus
num_bytes: 29949928
num_examples: 38316
download_size: 17827262
dataset_size: 30023183
---
# Dataset Card for "cqudubstack-physics"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dmrau/cqadubstack-physics-qrels | 2023-10-09T12:21:50.000Z | [
"region:us"
] | dmrau | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
dataset_info:
features:
- name: query-id
dtype: string
- name: corpus-id
dtype: string
- name: score
dtype: int64
splits:
- name: test
num_bytes: 50809
num_examples: 1933
download_size: 25022
dataset_size: 50809
---
# Dataset Card for "cqadubstack-physics-qrels"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dmrau/cqudubstack-stats | 2023-10-09T12:21:59.000Z | [
"region:us"
] | dmrau | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: queries
path: data/queries-*
- split: corpus
path: data/corpus-*
dataset_info:
features:
- name: _id
dtype: string
- name: text
dtype: string
- name: title
dtype: string
splits:
- name: queries
num_bytes: 47795
num_examples: 652
- name: corpus
num_bytes: 42923933
num_examples: 42269
download_size: 24679799
dataset_size: 42971728
---
# Dataset Card for "cqudubstack-stats"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dmrau/cqadubstack-stats-qrels | 2023-10-09T12:22:01.000Z | [
"region:us"
] | dmrau | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
dataset_info:
features:
- name: query-id
dtype: string
- name: corpus-id
dtype: string
- name: score
dtype: int64
splits:
- name: test
num_bytes: 23665
num_examples: 913
download_size: 13316
dataset_size: 23665
---
# Dataset Card for "cqadubstack-stats-qrels"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dmrau/cqudubstack-unix | 2023-10-09T12:22:11.000Z | [
"region:us"
] | dmrau | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: queries
path: data/queries-*
- split: corpus
path: data/corpus-*
dataset_info:
features:
- name: _id
dtype: string
- name: text
dtype: string
- name: title
dtype: string
splits:
- name: queries
num_bytes: 72357
num_examples: 1072
- name: corpus
num_bytes: 46102756
num_examples: 47382
download_size: 24571026
dataset_size: 46175113
---
# Dataset Card for "cqudubstack-unix"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dmrau/cqadubstack-unix-qrels | 2023-10-09T12:22:13.000Z | [
"region:us"
] | dmrau | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
dataset_info:
features:
- name: query-id
dtype: string
- name: corpus-id
dtype: string
- name: score
dtype: int64
splits:
- name: test
num_bytes: 44636
num_examples: 1693
download_size: 23577
dataset_size: 44636
---
# Dataset Card for "cqadubstack-unix-qrels"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dmrau/cqudubstack-wordpress | 2023-10-09T12:22:20.000Z | [
"region:us"
] | dmrau | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: queries
path: data/queries-*
- split: corpus
path: data/corpus-*
dataset_info:
features:
- name: _id
dtype: string
- name: text
dtype: string
- name: title
dtype: string
splits:
- name: queries
num_bytes: 35736
num_examples: 541
- name: corpus
num_bytes: 53026140
num_examples: 48605
download_size: 26551471
dataset_size: 53061876
---
# Dataset Card for "cqudubstack-wordpress"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dmrau/cqadubstack-wordpress-qrels | 2023-10-09T12:22:21.000Z | [
"region:us"
] | dmrau | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
dataset_info:
features:
- name: query-id
dtype: string
- name: corpus-id
dtype: string
- name: score
dtype: int64
splits:
- name: test
num_bytes: 19885
num_examples: 744
download_size: 11490
dataset_size: 19885
---
# Dataset Card for "cqadubstack-wordpress-qrels"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Open-Orca__Mistral-7B-OpenOrca | 2023-10-09T12:29:59.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Open-Orca/Mistral-7B-OpenOrca
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Open-Orca/Mistral-7B-OpenOrca](https://huggingface.co/Open-Orca/Mistral-7B-OpenOrca)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Open-Orca__Mistral-7B-OpenOrca\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-09T12:28:38.184371](https://huggingface.co/datasets/open-llm-leaderboard/details_Open-Orca__Mistral-7B-OpenOrca/blob/main/results_2023-10-09T12-28-38.184371.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6226382738018685,\n\
\ \"acc_stderr\": 0.03333085490919985,\n \"acc_norm\": 0.6264402611442766,\n\
\ \"acc_norm_stderr\": 0.03330796684085299,\n \"mc1\": 0.35862913096695226,\n\
\ \"mc1_stderr\": 0.016789289499502025,\n \"mc2\": 0.530457680483314,\n\
\ \"mc2_stderr\": 0.015449176637019318\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6083617747440273,\n \"acc_stderr\": 0.014264122124938211,\n\
\ \"acc_norm\": 0.6407849829351536,\n \"acc_norm_stderr\": 0.014020224155839157\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6479784903405696,\n\
\ \"acc_stderr\": 0.004766245539606633,\n \"acc_norm\": 0.8398725353515236,\n\
\ \"acc_norm_stderr\": 0.0036597474762410575\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5703703703703704,\n\
\ \"acc_stderr\": 0.04276349494376599,\n \"acc_norm\": 0.5703703703703704,\n\
\ \"acc_norm_stderr\": 0.04276349494376599\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n\
\ \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6867924528301886,\n \"acc_stderr\": 0.028544793319055326,\n\
\ \"acc_norm\": 0.6867924528301886,\n \"acc_norm_stderr\": 0.028544793319055326\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n\
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5895953757225434,\n\
\ \"acc_stderr\": 0.03750757044895537,\n \"acc_norm\": 0.5895953757225434,\n\
\ \"acc_norm_stderr\": 0.03750757044895537\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04690650298201943,\n\
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04690650298201943\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n\
\ \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5276595744680851,\n \"acc_stderr\": 0.03263597118409769,\n\
\ \"acc_norm\": 0.5276595744680851,\n \"acc_norm_stderr\": 0.03263597118409769\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.43859649122807015,\n\
\ \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.43859649122807015,\n\
\ \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41534391534391535,\n \"acc_stderr\": 0.0253795249107784,\n \"\
acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.0253795249107784\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n\
\ \"acc_stderr\": 0.04403438954768177,\n \"acc_norm\": 0.4126984126984127,\n\
\ \"acc_norm_stderr\": 0.04403438954768177\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7645161290322581,\n\
\ \"acc_stderr\": 0.02413763242933771,\n \"acc_norm\": 0.7645161290322581,\n\
\ \"acc_norm_stderr\": 0.02413763242933771\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.03517945038691063,\n\
\ \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.0347769116216366,\n\
\ \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.0347769116216366\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7575757575757576,\n \"acc_stderr\": 0.030532892233932022,\n \"\
acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.030532892233932022\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8549222797927462,\n \"acc_stderr\": 0.025416343096306446,\n\
\ \"acc_norm\": 0.8549222797927462,\n \"acc_norm_stderr\": 0.025416343096306446\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6333333333333333,\n \"acc_stderr\": 0.024433016466052462,\n\
\ \"acc_norm\": 0.6333333333333333,\n \"acc_norm_stderr\": 0.024433016466052462\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3592592592592593,\n \"acc_stderr\": 0.029252905927251972,\n \
\ \"acc_norm\": 0.3592592592592593,\n \"acc_norm_stderr\": 0.029252905927251972\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6260504201680672,\n \"acc_stderr\": 0.03142946637883708,\n \
\ \"acc_norm\": 0.6260504201680672,\n \"acc_norm_stderr\": 0.03142946637883708\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255169,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255169\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8146788990825689,\n \"acc_stderr\": 0.016659279700295824,\n \"\
acc_norm\": 0.8146788990825689,\n \"acc_norm_stderr\": 0.016659279700295824\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977749,\n \"\
acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977749\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.803921568627451,\n \"acc_stderr\": 0.027865942286639325,\n \"\
acc_norm\": 0.803921568627451,\n \"acc_norm_stderr\": 0.027865942286639325\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7932489451476793,\n \"acc_stderr\": 0.0263616516683891,\n \
\ \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.0263616516683891\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n\
\ \"acc_stderr\": 0.03160295143776679,\n \"acc_norm\": 0.6681614349775785,\n\
\ \"acc_norm_stderr\": 0.03160295143776679\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098825,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098825\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243838,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243838\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7423312883435583,\n \"acc_stderr\": 0.03436150827846917,\n\
\ \"acc_norm\": 0.7423312883435583,\n \"acc_norm_stderr\": 0.03436150827846917\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.039166677628225836,\n\
\ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.039166677628225836\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n\
\ \"acc_stderr\": 0.02280138253459753,\n \"acc_norm\": 0.8589743589743589,\n\
\ \"acc_norm_stderr\": 0.02280138253459753\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8186462324393359,\n\
\ \"acc_stderr\": 0.01377869377846407,\n \"acc_norm\": 0.8186462324393359,\n\
\ \"acc_norm_stderr\": 0.01377869377846407\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.025305258131879706,\n\
\ \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.025305258131879706\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3307262569832402,\n\
\ \"acc_stderr\": 0.01573502625896612,\n \"acc_norm\": 0.3307262569832402,\n\
\ \"acc_norm_stderr\": 0.01573502625896612\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6830065359477124,\n \"acc_stderr\": 0.02664327847450875,\n\
\ \"acc_norm\": 0.6830065359477124,\n \"acc_norm_stderr\": 0.02664327847450875\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6881028938906752,\n\
\ \"acc_stderr\": 0.02631185807185416,\n \"acc_norm\": 0.6881028938906752,\n\
\ \"acc_norm_stderr\": 0.02631185807185416\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7253086419753086,\n \"acc_stderr\": 0.024836057868294677,\n\
\ \"acc_norm\": 0.7253086419753086,\n \"acc_norm_stderr\": 0.024836057868294677\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.450354609929078,\n \"acc_stderr\": 0.029680105565029036,\n \
\ \"acc_norm\": 0.450354609929078,\n \"acc_norm_stderr\": 0.029680105565029036\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46088657105606257,\n\
\ \"acc_stderr\": 0.01273110279050452,\n \"acc_norm\": 0.46088657105606257,\n\
\ \"acc_norm_stderr\": 0.01273110279050452\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6029411764705882,\n \"acc_stderr\": 0.02972215209928007,\n\
\ \"acc_norm\": 0.6029411764705882,\n \"acc_norm_stderr\": 0.02972215209928007\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6290849673202614,\n \"acc_stderr\": 0.019542101564854125,\n \
\ \"acc_norm\": 0.6290849673202614,\n \"acc_norm_stderr\": 0.019542101564854125\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n\
\ \"acc_stderr\": 0.046075820907199756,\n \"acc_norm\": 0.6363636363636364,\n\
\ \"acc_norm_stderr\": 0.046075820907199756\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.028795185574291296,\n\
\ \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.028795185574291296\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.02587064676616913,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.02587064676616913\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036845,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.030611116557432528,\n\
\ \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.030611116557432528\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.35862913096695226,\n\
\ \"mc1_stderr\": 0.016789289499502025,\n \"mc2\": 0.530457680483314,\n\
\ \"mc2_stderr\": 0.015449176637019318\n }\n}\n```"
repo_url: https://huggingface.co/Open-Orca/Mistral-7B-OpenOrca
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|arc:challenge|25_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hellaswag|10_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T12-28-38.184371.parquet'
- config_name: results
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- results_2023-10-09T12-28-38.184371.parquet
- split: latest
path:
- results_2023-10-09T12-28-38.184371.parquet
---
# Dataset Card for Evaluation run of Open-Orca/Mistral-7B-OpenOrca
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Open-Orca/Mistral-7B-OpenOrca
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Open-Orca/Mistral-7B-OpenOrca](https://huggingface.co/Open-Orca/Mistral-7B-OpenOrca) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Open-Orca__Mistral-7B-OpenOrca",
"harness_truthfulqa_mc_0",
split="train")
```
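Since each run is stored as a timestamped split and `latest` always points to the most recent one, you can also pull a single task's details or the aggregated metrics directly. A minimal sketch (the config and split names below are the ones declared in this card's YAML header; swap in any other config listed there):
```python
from datasets import load_dataset

# Details for one MMLU subtask (5-shot abstract algebra), latest run only
details = load_dataset(
    "open-llm-leaderboard/details_Open-Orca__Mistral-7B-OpenOrca",
    "harness_hendrycksTest_abstract_algebra_5",
    split="latest",
)

# Aggregated per-task metrics stored in the "results" configuration
results = load_dataset(
    "open-llm-leaderboard/details_Open-Orca__Mistral-7B-OpenOrca",
    "results",
    split="latest",
)
print(results[0])
```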
## Latest results
These are the [latest results from run 2023-10-09T12:28:38.184371](https://huggingface.co/datasets/open-llm-leaderboard/details_Open-Orca__Mistral-7B-OpenOrca/blob/main/results_2023-10-09T12-28-38.184371.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6226382738018685,
"acc_stderr": 0.03333085490919985,
"acc_norm": 0.6264402611442766,
"acc_norm_stderr": 0.03330796684085299,
"mc1": 0.35862913096695226,
"mc1_stderr": 0.016789289499502025,
"mc2": 0.530457680483314,
"mc2_stderr": 0.015449176637019318
},
"harness|arc:challenge|25": {
"acc": 0.6083617747440273,
"acc_stderr": 0.014264122124938211,
"acc_norm": 0.6407849829351536,
"acc_norm_stderr": 0.014020224155839157
},
"harness|hellaswag|10": {
"acc": 0.6479784903405696,
"acc_stderr": 0.004766245539606633,
"acc_norm": 0.8398725353515236,
"acc_norm_stderr": 0.0036597474762410575
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5703703703703704,
"acc_stderr": 0.04276349494376599,
"acc_norm": 0.5703703703703704,
"acc_norm_stderr": 0.04276349494376599
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6867924528301886,
"acc_stderr": 0.028544793319055326,
"acc_norm": 0.6867924528301886,
"acc_norm_stderr": 0.028544793319055326
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.75,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5895953757225434,
"acc_stderr": 0.03750757044895537,
"acc_norm": 0.5895953757225434,
"acc_norm_stderr": 0.03750757044895537
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04690650298201943,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04690650298201943
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5276595744680851,
"acc_stderr": 0.03263597118409769,
"acc_norm": 0.5276595744680851,
"acc_norm_stderr": 0.03263597118409769
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.43859649122807015,
"acc_stderr": 0.04668000738510455,
"acc_norm": 0.43859649122807015,
"acc_norm_stderr": 0.04668000738510455
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41534391534391535,
"acc_stderr": 0.0253795249107784,
"acc_norm": 0.41534391534391535,
"acc_norm_stderr": 0.0253795249107784
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.04403438954768177,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.04403438954768177
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7645161290322581,
"acc_stderr": 0.02413763242933771,
"acc_norm": 0.7645161290322581,
"acc_norm_stderr": 0.02413763242933771
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.0347769116216366,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.0347769116216366
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.030532892233932022,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.030532892233932022
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8549222797927462,
"acc_stderr": 0.025416343096306446,
"acc_norm": 0.8549222797927462,
"acc_norm_stderr": 0.025416343096306446
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6333333333333333,
"acc_stderr": 0.024433016466052462,
"acc_norm": 0.6333333333333333,
"acc_norm_stderr": 0.024433016466052462
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3592592592592593,
"acc_stderr": 0.029252905927251972,
"acc_norm": 0.3592592592592593,
"acc_norm_stderr": 0.029252905927251972
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6260504201680672,
"acc_stderr": 0.03142946637883708,
"acc_norm": 0.6260504201680672,
"acc_norm_stderr": 0.03142946637883708
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.03861557546255169,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.03861557546255169
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8146788990825689,
"acc_stderr": 0.016659279700295824,
"acc_norm": 0.8146788990825689,
"acc_norm_stderr": 0.016659279700295824
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.03408655867977749,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.03408655867977749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.803921568627451,
"acc_stderr": 0.027865942286639325,
"acc_norm": 0.803921568627451,
"acc_norm_stderr": 0.027865942286639325
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7932489451476793,
"acc_stderr": 0.0263616516683891,
"acc_norm": 0.7932489451476793,
"acc_norm_stderr": 0.0263616516683891
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6681614349775785,
"acc_stderr": 0.03160295143776679,
"acc_norm": 0.6681614349775785,
"acc_norm_stderr": 0.03160295143776679
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098825,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098825
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243838,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243838
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7423312883435583,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.7423312883435583,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.039166677628225836,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.039166677628225836
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.02280138253459753,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.02280138253459753
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8186462324393359,
"acc_stderr": 0.01377869377846407,
"acc_norm": 0.8186462324393359,
"acc_norm_stderr": 0.01377869377846407
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.025305258131879706,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.025305258131879706
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3307262569832402,
"acc_stderr": 0.01573502625896612,
"acc_norm": 0.3307262569832402,
"acc_norm_stderr": 0.01573502625896612
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6830065359477124,
"acc_stderr": 0.02664327847450875,
"acc_norm": 0.6830065359477124,
"acc_norm_stderr": 0.02664327847450875
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6881028938906752,
"acc_stderr": 0.02631185807185416,
"acc_norm": 0.6881028938906752,
"acc_norm_stderr": 0.02631185807185416
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7253086419753086,
"acc_stderr": 0.024836057868294677,
"acc_norm": 0.7253086419753086,
"acc_norm_stderr": 0.024836057868294677
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.450354609929078,
"acc_stderr": 0.029680105565029036,
"acc_norm": 0.450354609929078,
"acc_norm_stderr": 0.029680105565029036
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46088657105606257,
"acc_stderr": 0.01273110279050452,
"acc_norm": 0.46088657105606257,
"acc_norm_stderr": 0.01273110279050452
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6029411764705882,
"acc_stderr": 0.02972215209928007,
"acc_norm": 0.6029411764705882,
"acc_norm_stderr": 0.02972215209928007
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6290849673202614,
"acc_stderr": 0.019542101564854125,
"acc_norm": 0.6290849673202614,
"acc_norm_stderr": 0.019542101564854125
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.046075820907199756,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.046075820907199756
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7183673469387755,
"acc_stderr": 0.028795185574291296,
"acc_norm": 0.7183673469387755,
"acc_norm_stderr": 0.028795185574291296
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616913,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616913
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8011695906432749,
"acc_stderr": 0.030611116557432528,
"acc_norm": 0.8011695906432749,
"acc_norm_stderr": 0.030611116557432528
},
"harness|truthfulqa:mc|0": {
"mc1": 0.35862913096695226,
"mc1_stderr": 0.016789289499502025,
"mc2": 0.530457680483314,
"mc2_stderr": 0.015449176637019318
}
}
```
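The same aggregated numbers also live in a standalone JSON file in the repository (the one linked above). If you prefer the raw file to the `results` configuration, here is a sketch using `huggingface_hub` (assuming it is installed; the key layout mirrors the snippet above, but inspect the file if it differs):
```python
import json

from huggingface_hub import hf_hub_download

# Fetch the raw results file referenced in the "Latest results" link above
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_Open-Orca__Mistral-7B-OpenOrca",
    filename="results_2023-10-09T12-28-38.184371.json",
    repo_type="dataset",
)

with open(path) as f:
    results = json.load(f)

# Top-level "all" block, as shown in the snippet above
print(results["all"])
```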
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_PulsarAI__Nebula-7B | 2023-10-09T12:31:00.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of PulsarAI/Nebula-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [PulsarAI/Nebula-7B](https://huggingface.co/PulsarAI/Nebula-7B) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_PulsarAI__Nebula-7B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-09T12:29:36.965037](https://huggingface.co/datasets/open-llm-leaderboard/details_PulsarAI__Nebula-7B/blob/main/results_2023-10-09T12-29-36.965037.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.570596346471807,\n\
\ \"acc_stderr\": 0.034371584431446715,\n \"acc_norm\": 0.5748599572103322,\n\
\ \"acc_norm_stderr\": 0.03434950734212607,\n \"mc1\": 0.31334149326805383,\n\
\ \"mc1_stderr\": 0.0162380650690596,\n \"mc2\": 0.45561649492894496,\n\
\ \"mc2_stderr\": 0.014644899277894422\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5418088737201365,\n \"acc_stderr\": 0.0145602203087147,\n\
\ \"acc_norm\": 0.5930034129692833,\n \"acc_norm_stderr\": 0.014356399418009121\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6342362079267079,\n\
\ \"acc_stderr\": 0.004806593424942265,\n \"acc_norm\": 0.8345947022505477,\n\
\ \"acc_norm_stderr\": 0.0037078660457296048\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5555555555555556,\n\
\ \"acc_stderr\": 0.04292596718256981,\n \"acc_norm\": 0.5555555555555556,\n\
\ \"acc_norm_stderr\": 0.04292596718256981\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5986842105263158,\n \"acc_stderr\": 0.039889037033362836,\n\
\ \"acc_norm\": 0.5986842105263158,\n \"acc_norm_stderr\": 0.039889037033362836\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.43,\n\
\ \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.43,\n \
\ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5773584905660377,\n \"acc_stderr\": 0.03040233144576954,\n\
\ \"acc_norm\": 0.5773584905660377,\n \"acc_norm_stderr\": 0.03040233144576954\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6944444444444444,\n\
\ \"acc_stderr\": 0.03852084696008534,\n \"acc_norm\": 0.6944444444444444,\n\
\ \"acc_norm_stderr\": 0.03852084696008534\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\
: 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5202312138728323,\n\
\ \"acc_stderr\": 0.03809342081273957,\n \"acc_norm\": 0.5202312138728323,\n\
\ \"acc_norm_stderr\": 0.03809342081273957\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3137254901960784,\n \"acc_stderr\": 0.04617034827006715,\n\
\ \"acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.04617034827006715\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.68,\n\
\ \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5063829787234042,\n \"acc_stderr\": 0.03268335899936336,\n\
\ \"acc_norm\": 0.5063829787234042,\n \"acc_norm_stderr\": 0.03268335899936336\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.04692008381368909,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.04692008381368909\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.503448275862069,\n \"acc_stderr\": 0.041665675771015785,\n\
\ \"acc_norm\": 0.503448275862069,\n \"acc_norm_stderr\": 0.041665675771015785\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.38095238095238093,\n \"acc_stderr\": 0.02501074911613759,\n \"\
acc_norm\": 0.38095238095238093,\n \"acc_norm_stderr\": 0.02501074911613759\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n\
\ \"acc_stderr\": 0.043902592653775635,\n \"acc_norm\": 0.40476190476190477,\n\
\ \"acc_norm_stderr\": 0.043902592653775635\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6516129032258065,\n\
\ \"acc_stderr\": 0.027104826328100944,\n \"acc_norm\": 0.6516129032258065,\n\
\ \"acc_norm_stderr\": 0.027104826328100944\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.47783251231527096,\n \"acc_stderr\": 0.035145285621750094,\n\
\ \"acc_norm\": 0.47783251231527096,\n \"acc_norm_stderr\": 0.035145285621750094\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\"\
: 0.56,\n \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.696969696969697,\n \"acc_stderr\": 0.03588624800091707,\n\
\ \"acc_norm\": 0.696969696969697,\n \"acc_norm_stderr\": 0.03588624800091707\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7525252525252525,\n \"acc_stderr\": 0.030746300742124495,\n \"\
acc_norm\": 0.7525252525252525,\n \"acc_norm_stderr\": 0.030746300742124495\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8082901554404145,\n \"acc_stderr\": 0.028408953626245282,\n\
\ \"acc_norm\": 0.8082901554404145,\n \"acc_norm_stderr\": 0.028408953626245282\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5333333333333333,\n \"acc_stderr\": 0.02529460802398647,\n \
\ \"acc_norm\": 0.5333333333333333,\n \"acc_norm_stderr\": 0.02529460802398647\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2814814814814815,\n \"acc_stderr\": 0.02742001935094527,\n \
\ \"acc_norm\": 0.2814814814814815,\n \"acc_norm_stderr\": 0.02742001935094527\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5588235294117647,\n \"acc_stderr\": 0.032252942323996406,\n\
\ \"acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.032252942323996406\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.03879687024073327,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.03879687024073327\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7743119266055046,\n \"acc_stderr\": 0.017923087667803064,\n \"\
acc_norm\": 0.7743119266055046,\n \"acc_norm_stderr\": 0.017923087667803064\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.39351851851851855,\n \"acc_stderr\": 0.03331747876370312,\n \"\
acc_norm\": 0.39351851851851855,\n \"acc_norm_stderr\": 0.03331747876370312\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7156862745098039,\n \"acc_stderr\": 0.031660096793998116,\n \"\
acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.031660096793998116\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7510548523206751,\n \"acc_stderr\": 0.028146970599422644,\n \
\ \"acc_norm\": 0.7510548523206751,\n \"acc_norm_stderr\": 0.028146970599422644\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n\
\ \"acc_stderr\": 0.03160295143776679,\n \"acc_norm\": 0.6681614349775785,\n\
\ \"acc_norm_stderr\": 0.03160295143776679\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6793893129770993,\n \"acc_stderr\": 0.04093329229834278,\n\
\ \"acc_norm\": 0.6793893129770993,\n \"acc_norm_stderr\": 0.04093329229834278\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7355371900826446,\n \"acc_stderr\": 0.040261875275912046,\n \"\
acc_norm\": 0.7355371900826446,\n \"acc_norm_stderr\": 0.040261875275912046\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6851851851851852,\n\
\ \"acc_stderr\": 0.04489931073591311,\n \"acc_norm\": 0.6851851851851852,\n\
\ \"acc_norm_stderr\": 0.04489931073591311\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6932515337423313,\n \"acc_stderr\": 0.03623089915724146,\n\
\ \"acc_norm\": 0.6932515337423313,\n \"acc_norm_stderr\": 0.03623089915724146\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.38392857142857145,\n\
\ \"acc_stderr\": 0.04616143075028547,\n \"acc_norm\": 0.38392857142857145,\n\
\ \"acc_norm_stderr\": 0.04616143075028547\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6990291262135923,\n \"acc_stderr\": 0.04541609446503947,\n\
\ \"acc_norm\": 0.6990291262135923,\n \"acc_norm_stderr\": 0.04541609446503947\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8205128205128205,\n\
\ \"acc_stderr\": 0.025140935950335442,\n \"acc_norm\": 0.8205128205128205,\n\
\ \"acc_norm_stderr\": 0.025140935950335442\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \
\ \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8007662835249042,\n\
\ \"acc_stderr\": 0.014283378044296422,\n \"acc_norm\": 0.8007662835249042,\n\
\ \"acc_norm_stderr\": 0.014283378044296422\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6242774566473989,\n \"acc_stderr\": 0.02607431485165708,\n\
\ \"acc_norm\": 0.6242774566473989,\n \"acc_norm_stderr\": 0.02607431485165708\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2659217877094972,\n\
\ \"acc_stderr\": 0.014776765066438885,\n \"acc_norm\": 0.2659217877094972,\n\
\ \"acc_norm_stderr\": 0.014776765066438885\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6274509803921569,\n \"acc_stderr\": 0.027684181883302888,\n\
\ \"acc_norm\": 0.6274509803921569,\n \"acc_norm_stderr\": 0.027684181883302888\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.662379421221865,\n\
\ \"acc_stderr\": 0.026858825879488544,\n \"acc_norm\": 0.662379421221865,\n\
\ \"acc_norm_stderr\": 0.026858825879488544\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7067901234567902,\n \"acc_stderr\": 0.025329888171900926,\n\
\ \"acc_norm\": 0.7067901234567902,\n \"acc_norm_stderr\": 0.025329888171900926\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.44680851063829785,\n \"acc_stderr\": 0.02965823509766691,\n \
\ \"acc_norm\": 0.44680851063829785,\n \"acc_norm_stderr\": 0.02965823509766691\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4406779661016949,\n\
\ \"acc_stderr\": 0.012680037994097065,\n \"acc_norm\": 0.4406779661016949,\n\
\ \"acc_norm_stderr\": 0.012680037994097065\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5441176470588235,\n \"acc_stderr\": 0.03025437257397671,\n\
\ \"acc_norm\": 0.5441176470588235,\n \"acc_norm_stderr\": 0.03025437257397671\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5980392156862745,\n \"acc_stderr\": 0.019835176484375383,\n \
\ \"acc_norm\": 0.5980392156862745,\n \"acc_norm_stderr\": 0.019835176484375383\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6181818181818182,\n\
\ \"acc_stderr\": 0.046534298079135075,\n \"acc_norm\": 0.6181818181818182,\n\
\ \"acc_norm_stderr\": 0.046534298079135075\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5142857142857142,\n \"acc_stderr\": 0.03199615232806286,\n\
\ \"acc_norm\": 0.5142857142857142,\n \"acc_norm_stderr\": 0.03199615232806286\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7661691542288557,\n\
\ \"acc_stderr\": 0.029929415408348377,\n \"acc_norm\": 0.7661691542288557,\n\
\ \"acc_norm_stderr\": 0.029929415408348377\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4819277108433735,\n\
\ \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.4819277108433735,\n\
\ \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7660818713450293,\n \"acc_stderr\": 0.03246721765117826,\n\
\ \"acc_norm\": 0.7660818713450293,\n \"acc_norm_stderr\": 0.03246721765117826\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.31334149326805383,\n\
\ \"mc1_stderr\": 0.0162380650690596,\n \"mc2\": 0.45561649492894496,\n\
\ \"mc2_stderr\": 0.014644899277894422\n }\n}\n```"
repo_url: https://huggingface.co/PulsarAI/Nebula-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|arc:challenge|25_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hellaswag|10_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T12-29-36.965037.parquet'
- config_name: results
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- results_2023-10-09T12-29-36.965037.parquet
- split: latest
path:
- results_2023-10-09T12-29-36.965037.parquet
---
# Dataset Card for Evaluation run of PulsarAI/Nebula-7B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/PulsarAI/Nebula-7B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [PulsarAI/Nebula-7B](https://huggingface.co/PulsarAI/Nebula-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_PulsarAI__Nebula-7B",
"harness_truthfulqa_mc_0",
split="train")
```
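Each of the per-task configurations listed in the metadata above can be loaded the same way. As a minimal sketch (the configuration and split names below are taken from this card's YAML metadata; "latest" always resolves to the most recent run, while the timestamped split pins a specific one):
```python
from datasets import load_dataset

# Load the details for a single subtask; "latest" resolves to the most
# recent run, while the timestamped split (e.g. "2023_10_09T12_29_36.965037")
# pins the results of one specific evaluation run.
details = load_dataset(
    "open-llm-leaderboard/details_PulsarAI__Nebula-7B",
    "harness_hendrycksTest_abstract_algebra_5",
    split="latest",
)
print(details)
```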
## Latest results
These are the [latest results from run 2023-10-09T12:29:36.965037](https://huggingface.co/datasets/open-llm-leaderboard/details_PulsarAI__Nebula-7B/blob/main/results_2023-10-09T12-29-36.965037.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.570596346471807,
"acc_stderr": 0.034371584431446715,
"acc_norm": 0.5748599572103322,
"acc_norm_stderr": 0.03434950734212607,
"mc1": 0.31334149326805383,
"mc1_stderr": 0.0162380650690596,
"mc2": 0.45561649492894496,
"mc2_stderr": 0.014644899277894422
},
"harness|arc:challenge|25": {
"acc": 0.5418088737201365,
"acc_stderr": 0.0145602203087147,
"acc_norm": 0.5930034129692833,
"acc_norm_stderr": 0.014356399418009121
},
"harness|hellaswag|10": {
"acc": 0.6342362079267079,
"acc_stderr": 0.004806593424942265,
"acc_norm": 0.8345947022505477,
"acc_norm_stderr": 0.0037078660457296048
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.04292596718256981,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.04292596718256981
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5986842105263158,
"acc_stderr": 0.039889037033362836,
"acc_norm": 0.5986842105263158,
"acc_norm_stderr": 0.039889037033362836
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5773584905660377,
"acc_stderr": 0.03040233144576954,
"acc_norm": 0.5773584905660377,
"acc_norm_stderr": 0.03040233144576954
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.03852084696008534,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.03852084696008534
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5202312138728323,
"acc_stderr": 0.03809342081273957,
"acc_norm": 0.5202312138728323,
"acc_norm_stderr": 0.03809342081273957
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3137254901960784,
"acc_stderr": 0.04617034827006715,
"acc_norm": 0.3137254901960784,
"acc_norm_stderr": 0.04617034827006715
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5063829787234042,
"acc_stderr": 0.03268335899936336,
"acc_norm": 0.5063829787234042,
"acc_norm_stderr": 0.03268335899936336
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.04692008381368909,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.04692008381368909
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.503448275862069,
"acc_stderr": 0.041665675771015785,
"acc_norm": 0.503448275862069,
"acc_norm_stderr": 0.041665675771015785
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.02501074911613759,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.02501074911613759
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.043902592653775635,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.043902592653775635
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6516129032258065,
"acc_stderr": 0.027104826328100944,
"acc_norm": 0.6516129032258065,
"acc_norm_stderr": 0.027104826328100944
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.47783251231527096,
"acc_stderr": 0.035145285621750094,
"acc_norm": 0.47783251231527096,
"acc_norm_stderr": 0.035145285621750094
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.696969696969697,
"acc_stderr": 0.03588624800091707,
"acc_norm": 0.696969696969697,
"acc_norm_stderr": 0.03588624800091707
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7525252525252525,
"acc_stderr": 0.030746300742124495,
"acc_norm": 0.7525252525252525,
"acc_norm_stderr": 0.030746300742124495
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8082901554404145,
"acc_stderr": 0.028408953626245282,
"acc_norm": 0.8082901554404145,
"acc_norm_stderr": 0.028408953626245282
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5333333333333333,
"acc_stderr": 0.02529460802398647,
"acc_norm": 0.5333333333333333,
"acc_norm_stderr": 0.02529460802398647
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2814814814814815,
"acc_stderr": 0.02742001935094527,
"acc_norm": 0.2814814814814815,
"acc_norm_stderr": 0.02742001935094527
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5588235294117647,
"acc_stderr": 0.032252942323996406,
"acc_norm": 0.5588235294117647,
"acc_norm_stderr": 0.032252942323996406
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.03879687024073327,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.03879687024073327
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7743119266055046,
"acc_stderr": 0.017923087667803064,
"acc_norm": 0.7743119266055046,
"acc_norm_stderr": 0.017923087667803064
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.39351851851851855,
"acc_stderr": 0.03331747876370312,
"acc_norm": 0.39351851851851855,
"acc_norm_stderr": 0.03331747876370312
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.031660096793998116,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.031660096793998116
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7510548523206751,
"acc_stderr": 0.028146970599422644,
"acc_norm": 0.7510548523206751,
"acc_norm_stderr": 0.028146970599422644
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6681614349775785,
"acc_stderr": 0.03160295143776679,
"acc_norm": 0.6681614349775785,
"acc_norm_stderr": 0.03160295143776679
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6793893129770993,
"acc_stderr": 0.04093329229834278,
"acc_norm": 0.6793893129770993,
"acc_norm_stderr": 0.04093329229834278
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7355371900826446,
"acc_stderr": 0.040261875275912046,
"acc_norm": 0.7355371900826446,
"acc_norm_stderr": 0.040261875275912046
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6851851851851852,
"acc_stderr": 0.04489931073591311,
"acc_norm": 0.6851851851851852,
"acc_norm_stderr": 0.04489931073591311
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6932515337423313,
"acc_stderr": 0.03623089915724146,
"acc_norm": 0.6932515337423313,
"acc_norm_stderr": 0.03623089915724146
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.38392857142857145,
"acc_stderr": 0.04616143075028547,
"acc_norm": 0.38392857142857145,
"acc_norm_stderr": 0.04616143075028547
},
"harness|hendrycksTest-management|5": {
"acc": 0.6990291262135923,
"acc_stderr": 0.04541609446503947,
"acc_norm": 0.6990291262135923,
"acc_norm_stderr": 0.04541609446503947
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8205128205128205,
"acc_stderr": 0.025140935950335442,
"acc_norm": 0.8205128205128205,
"acc_norm_stderr": 0.025140935950335442
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8007662835249042,
"acc_stderr": 0.014283378044296422,
"acc_norm": 0.8007662835249042,
"acc_norm_stderr": 0.014283378044296422
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6242774566473989,
"acc_stderr": 0.02607431485165708,
"acc_norm": 0.6242774566473989,
"acc_norm_stderr": 0.02607431485165708
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2659217877094972,
"acc_stderr": 0.014776765066438885,
"acc_norm": 0.2659217877094972,
"acc_norm_stderr": 0.014776765066438885
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6274509803921569,
"acc_stderr": 0.027684181883302888,
"acc_norm": 0.6274509803921569,
"acc_norm_stderr": 0.027684181883302888
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.662379421221865,
"acc_stderr": 0.026858825879488544,
"acc_norm": 0.662379421221865,
"acc_norm_stderr": 0.026858825879488544
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7067901234567902,
"acc_stderr": 0.025329888171900926,
"acc_norm": 0.7067901234567902,
"acc_norm_stderr": 0.025329888171900926
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.44680851063829785,
"acc_stderr": 0.02965823509766691,
"acc_norm": 0.44680851063829785,
"acc_norm_stderr": 0.02965823509766691
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4406779661016949,
"acc_stderr": 0.012680037994097065,
"acc_norm": 0.4406779661016949,
"acc_norm_stderr": 0.012680037994097065
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5441176470588235,
"acc_stderr": 0.03025437257397671,
"acc_norm": 0.5441176470588235,
"acc_norm_stderr": 0.03025437257397671
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5980392156862745,
"acc_stderr": 0.019835176484375383,
"acc_norm": 0.5980392156862745,
"acc_norm_stderr": 0.019835176484375383
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6181818181818182,
"acc_stderr": 0.046534298079135075,
"acc_norm": 0.6181818181818182,
"acc_norm_stderr": 0.046534298079135075
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5142857142857142,
"acc_stderr": 0.03199615232806286,
"acc_norm": 0.5142857142857142,
"acc_norm_stderr": 0.03199615232806286
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7661691542288557,
"acc_stderr": 0.029929415408348377,
"acc_norm": 0.7661691542288557,
"acc_norm_stderr": 0.029929415408348377
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4819277108433735,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.4819277108433735,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7660818713450293,
"acc_stderr": 0.03246721765117826,
"acc_norm": 0.7660818713450293,
"acc_norm_stderr": 0.03246721765117826
},
"harness|truthfulqa:mc|0": {
"mc1": 0.31334149326805383,
"mc1_stderr": 0.0162380650690596,
"mc2": 0.45561649492894496,
"mc2_stderr": 0.014644899277894422
}
}
```
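The same aggregated numbers are also stored in the `results` configuration of this dataset; a minimal sketch for reading them back (the exact column layout is not shown in this card, so inspect the loaded dataset before relying on specific fields):
```python
from datasets import load_dataset

# The "results" configuration holds the aggregated metrics shown above;
# its "latest" split points at results_2023-10-09T12-29-36.965037.parquet.
results = load_dataset(
    "open-llm-leaderboard/details_PulsarAI__Nebula-7B",
    "results",
    split="latest",
)
print(results)  # inspect the available columns before using specific metrics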
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
promptora11/finetune | 2023-10-09T12:38:42.000Z | [
"region:us"
] | promptora11 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 74044
num_examples: 100
download_size: 7883
dataset_size: 74044
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "finetune"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dmrau/cqudupstack-android | 2023-10-09T12:36:34.000Z | [
"region:us"
] | dmrau | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: queries
path: data/queries-*
- split: corpus
path: data/corpus-*
dataset_info:
features:
- name: _id
dtype: string
- name: text
dtype: string
- name: title
dtype: string
splits:
- name: queries
num_bytes: 47953
num_examples: 699
- name: corpus
num_bytes: 12840959
num_examples: 22998
download_size: 7657118
dataset_size: 12888912
---
# Dataset Card for "cqudupstack-android"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dmrau/cqudupstack-gaming | 2023-10-09T12:36:47.000Z | [
"region:us"
] | dmrau | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: queries
path: data/queries-*
- split: corpus
path: data/corpus-*
dataset_info:
features:
- name: _id
dtype: string
- name: text
dtype: string
- name: title
dtype: string
splits:
- name: queries
num_bytes: 105494
num_examples: 1595
- name: corpus
num_bytes: 20666596
num_examples: 45301
download_size: 12946080
dataset_size: 20772090
---
# Dataset Card for "cqudupstack-gaming"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dmrau/cqudupstack-mathematica | 2023-10-09T12:36:57.000Z | [
"region:us"
] | dmrau | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: queries
path: data/queries-*
- split: corpus
path: data/corpus-*
dataset_info:
features:
- name: _id
dtype: string
- name: text
dtype: string
- name: title
dtype: string
splits:
- name: queries
num_bytes: 52792
num_examples: 804
- name: corpus
num_bytes: 18735825
num_examples: 16705
download_size: 10393860
dataset_size: 18788617
---
# Dataset Card for "cqudupstack-mathematica"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dmrau/cqudupstack-programmers | 2023-10-09T12:37:07.000Z | [
"region:us"
] | dmrau | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: queries
path: data/queries-*
- split: corpus
path: data/corpus-*
dataset_info:
features:
- name: _id
dtype: string
- name: text
dtype: string
- name: title
dtype: string
splits:
- name: queries
num_bytes: 63785
num_examples: 876
- name: corpus
num_bytes: 32727262
num_examples: 32176
download_size: 19360000
dataset_size: 32791047
---
# Dataset Card for "cqudupstack-programmers"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Weyaxi__Samantha-Nebula-7B | 2023-10-09T12:38:09.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Weyaxi/Samantha-Nebula-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Weyaxi/Samantha-Nebula-7B](https://huggingface.co/Weyaxi/Samantha-Nebula-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Weyaxi__Samantha-Nebula-7B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-09T12:36:46.129297](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__Samantha-Nebula-7B/blob/main/results_2023-10-09T12-36-46.129297.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5434029511726591,\n\
\ \"acc_stderr\": 0.03489878946352031,\n \"acc_norm\": 0.5473439940952133,\n\
\ \"acc_norm_stderr\": 0.03487982159376023,\n \"mc1\": 0.3561811505507956,\n\
\ \"mc1_stderr\": 0.01676379072844633,\n \"mc2\": 0.49583823304450064,\n\
\ \"mc2_stderr\": 0.015451218745335026\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5264505119453925,\n \"acc_stderr\": 0.014590931358120169,\n\
\ \"acc_norm\": 0.5699658703071673,\n \"acc_norm_stderr\": 0.014467631559137991\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6335391356303525,\n\
\ \"acc_stderr\": 0.004808526802718585,\n \"acc_norm\": 0.8225453096992631,\n\
\ \"acc_norm_stderr\": 0.0038127222858557754\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5333333333333333,\n\
\ \"acc_stderr\": 0.043097329010363554,\n \"acc_norm\": 0.5333333333333333,\n\
\ \"acc_norm_stderr\": 0.043097329010363554\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5855263157894737,\n \"acc_stderr\": 0.04008973785779206,\n\
\ \"acc_norm\": 0.5855263157894737,\n \"acc_norm_stderr\": 0.04008973785779206\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.44,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.569811320754717,\n \"acc_stderr\": 0.030471445867183235,\n\
\ \"acc_norm\": 0.569811320754717,\n \"acc_norm_stderr\": 0.030471445867183235\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5833333333333334,\n\
\ \"acc_stderr\": 0.041227287076512825,\n \"acc_norm\": 0.5833333333333334,\n\
\ \"acc_norm_stderr\": 0.041227287076512825\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n\
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5260115606936416,\n\
\ \"acc_stderr\": 0.038073017265045105,\n \"acc_norm\": 0.5260115606936416,\n\
\ \"acc_norm_stderr\": 0.038073017265045105\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.28431372549019607,\n \"acc_stderr\": 0.04488482852329017,\n\
\ \"acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.04488482852329017\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n\
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5063829787234042,\n \"acc_stderr\": 0.032683358999363366,\n\
\ \"acc_norm\": 0.5063829787234042,\n \"acc_norm_stderr\": 0.032683358999363366\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.37719298245614036,\n\
\ \"acc_stderr\": 0.04559522141958216,\n \"acc_norm\": 0.37719298245614036,\n\
\ \"acc_norm_stderr\": 0.04559522141958216\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4482758620689655,\n \"acc_stderr\": 0.04144311810878151,\n\
\ \"acc_norm\": 0.4482758620689655,\n \"acc_norm_stderr\": 0.04144311810878151\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.37037037037037035,\n \"acc_stderr\": 0.024870815251057082,\n \"\
acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.024870815251057082\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.6258064516129033,\n \"acc_stderr\": 0.027528904299845686,\n \"\
acc_norm\": 0.6258064516129033,\n \"acc_norm_stderr\": 0.027528904299845686\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4433497536945813,\n \"acc_stderr\": 0.03495334582162934,\n \"\
acc_norm\": 0.4433497536945813,\n \"acc_norm_stderr\": 0.03495334582162934\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\"\
: 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.036639749943912434,\n\
\ \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.036639749943912434\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6818181818181818,\n \"acc_stderr\": 0.03318477333845331,\n \"\
acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.03318477333845331\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7927461139896373,\n \"acc_stderr\": 0.029252823291803627,\n\
\ \"acc_norm\": 0.7927461139896373,\n \"acc_norm_stderr\": 0.029252823291803627\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5230769230769231,\n \"acc_stderr\": 0.025323990861736242,\n\
\ \"acc_norm\": 0.5230769230769231,\n \"acc_norm_stderr\": 0.025323990861736242\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2851851851851852,\n \"acc_stderr\": 0.027528599210340496,\n \
\ \"acc_norm\": 0.2851851851851852,\n \"acc_norm_stderr\": 0.027528599210340496\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5336134453781513,\n \"acc_stderr\": 0.03240501447690071,\n \
\ \"acc_norm\": 0.5336134453781513,\n \"acc_norm_stderr\": 0.03240501447690071\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.038227469376587525,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.038227469376587525\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7486238532110092,\n \"acc_stderr\": 0.018599206360287415,\n \"\
acc_norm\": 0.7486238532110092,\n \"acc_norm_stderr\": 0.018599206360287415\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3611111111111111,\n \"acc_stderr\": 0.032757734861009996,\n \"\
acc_norm\": 0.3611111111111111,\n \"acc_norm_stderr\": 0.032757734861009996\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6862745098039216,\n \"acc_stderr\": 0.032566854844603886,\n \"\
acc_norm\": 0.6862745098039216,\n \"acc_norm_stderr\": 0.032566854844603886\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7046413502109705,\n \"acc_stderr\": 0.02969633871342288,\n \
\ \"acc_norm\": 0.7046413502109705,\n \"acc_norm_stderr\": 0.02969633871342288\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6143497757847534,\n\
\ \"acc_stderr\": 0.03266842214289201,\n \"acc_norm\": 0.6143497757847534,\n\
\ \"acc_norm_stderr\": 0.03266842214289201\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5572519083969466,\n \"acc_stderr\": 0.043564472026650695,\n\
\ \"acc_norm\": 0.5572519083969466,\n \"acc_norm_stderr\": 0.043564472026650695\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6446280991735537,\n \"acc_stderr\": 0.0436923632657398,\n \"acc_norm\"\
: 0.6446280991735537,\n \"acc_norm_stderr\": 0.0436923632657398\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6018518518518519,\n\
\ \"acc_stderr\": 0.04732332615978815,\n \"acc_norm\": 0.6018518518518519,\n\
\ \"acc_norm_stderr\": 0.04732332615978815\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6380368098159509,\n \"acc_stderr\": 0.037757007291414416,\n\
\ \"acc_norm\": 0.6380368098159509,\n \"acc_norm_stderr\": 0.037757007291414416\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.36607142857142855,\n\
\ \"acc_stderr\": 0.045723723587374296,\n \"acc_norm\": 0.36607142857142855,\n\
\ \"acc_norm_stderr\": 0.045723723587374296\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6699029126213593,\n \"acc_stderr\": 0.046561471100123514,\n\
\ \"acc_norm\": 0.6699029126213593,\n \"acc_norm_stderr\": 0.046561471100123514\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7991452991452992,\n\
\ \"acc_stderr\": 0.02624677294689048,\n \"acc_norm\": 0.7991452991452992,\n\
\ \"acc_norm_stderr\": 0.02624677294689048\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7586206896551724,\n\
\ \"acc_stderr\": 0.015302380123542103,\n \"acc_norm\": 0.7586206896551724,\n\
\ \"acc_norm_stderr\": 0.015302380123542103\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6069364161849711,\n \"acc_stderr\": 0.026296227915613674,\n\
\ \"acc_norm\": 0.6069364161849711,\n \"acc_norm_stderr\": 0.026296227915613674\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2335195530726257,\n\
\ \"acc_stderr\": 0.014149575348976266,\n \"acc_norm\": 0.2335195530726257,\n\
\ \"acc_norm_stderr\": 0.014149575348976266\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5947712418300654,\n \"acc_stderr\": 0.028110928492809068,\n\
\ \"acc_norm\": 0.5947712418300654,\n \"acc_norm_stderr\": 0.028110928492809068\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6688102893890675,\n\
\ \"acc_stderr\": 0.02673062072800491,\n \"acc_norm\": 0.6688102893890675,\n\
\ \"acc_norm_stderr\": 0.02673062072800491\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6512345679012346,\n \"acc_stderr\": 0.026517597724465013,\n\
\ \"acc_norm\": 0.6512345679012346,\n \"acc_norm_stderr\": 0.026517597724465013\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.42907801418439717,\n \"acc_stderr\": 0.029525914302558562,\n \
\ \"acc_norm\": 0.42907801418439717,\n \"acc_norm_stderr\": 0.029525914302558562\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.41460234680573665,\n\
\ \"acc_stderr\": 0.012582597058908284,\n \"acc_norm\": 0.41460234680573665,\n\
\ \"acc_norm_stderr\": 0.012582597058908284\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5257352941176471,\n \"acc_stderr\": 0.03033257809455504,\n\
\ \"acc_norm\": 0.5257352941176471,\n \"acc_norm_stderr\": 0.03033257809455504\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5784313725490197,\n \"acc_stderr\": 0.019977422600227477,\n \
\ \"acc_norm\": 0.5784313725490197,\n \"acc_norm_stderr\": 0.019977422600227477\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5909090909090909,\n\
\ \"acc_stderr\": 0.04709306978661895,\n \"acc_norm\": 0.5909090909090909,\n\
\ \"acc_norm_stderr\": 0.04709306978661895\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5183673469387755,\n \"acc_stderr\": 0.03198761546763127,\n\
\ \"acc_norm\": 0.5183673469387755,\n \"acc_norm_stderr\": 0.03198761546763127\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7313432835820896,\n\
\ \"acc_stderr\": 0.031343283582089536,\n \"acc_norm\": 0.7313432835820896,\n\
\ \"acc_norm_stderr\": 0.031343283582089536\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.040201512610368445,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.040201512610368445\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n\
\ \"acc_stderr\": 0.038913644958358175,\n \"acc_norm\": 0.5120481927710844,\n\
\ \"acc_norm_stderr\": 0.038913644958358175\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7485380116959064,\n \"acc_stderr\": 0.033275044238468436,\n\
\ \"acc_norm\": 0.7485380116959064,\n \"acc_norm_stderr\": 0.033275044238468436\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3561811505507956,\n\
\ \"mc1_stderr\": 0.01676379072844633,\n \"mc2\": 0.49583823304450064,\n\
\ \"mc2_stderr\": 0.015451218745335026\n }\n}\n```"
repo_url: https://huggingface.co/Weyaxi/Samantha-Nebula-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|arc:challenge|25_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hellaswag|10_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T12-36-46.129297.parquet'
- config_name: results
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- results_2023-10-09T12-36-46.129297.parquet
- split: latest
path:
- results_2023-10-09T12-36-46.129297.parquet
---
# Dataset Card for Evaluation run of Weyaxi/Samantha-Nebula-7B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Weyaxi/Samantha-Nebula-7B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Weyaxi/Samantha-Nebula-7B](https://huggingface.co/Weyaxi/Samantha-Nebula-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Weyaxi__Samantha-Nebula-7B",
"harness_truthfulqa_mc_0",
split="train")
```
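The same call works for any of the configurations listed in this card's metadata. As a further sketch (assuming `load_dataset` accepts the config and split names declared above verbatim, including the timestamped split names), you can pull the aggregated metrics or the details of one specific run:
```python
from datasets import load_dataset

# Aggregated metrics for the most recent run, via the "results" config
# and its "latest" split (both declared in this card's metadata).
results = load_dataset(
    "open-llm-leaderboard/details_Weyaxi__Samantha-Nebula-7B",
    "results",
    split="latest",
)

# Per-task details for one specific run, addressed by its timestamped
# split name exactly as it appears in the configs above.
arc_details = load_dataset(
    "open-llm-leaderboard/details_Weyaxi__Samantha-Nebula-7B",
    "harness_arc_challenge_25",
    split="2023_10_09T12_36_46.129297",
)
```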
## Latest results
These are the [latest results from run 2023-10-09T12:36:46.129297](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__Samantha-Nebula-7B/blob/main/results_2023-10-09T12-36-46.129297.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5434029511726591,
"acc_stderr": 0.03489878946352031,
"acc_norm": 0.5473439940952133,
"acc_norm_stderr": 0.03487982159376023,
"mc1": 0.3561811505507956,
"mc1_stderr": 0.01676379072844633,
"mc2": 0.49583823304450064,
"mc2_stderr": 0.015451218745335026
},
"harness|arc:challenge|25": {
"acc": 0.5264505119453925,
"acc_stderr": 0.014590931358120169,
"acc_norm": 0.5699658703071673,
"acc_norm_stderr": 0.014467631559137991
},
"harness|hellaswag|10": {
"acc": 0.6335391356303525,
"acc_stderr": 0.004808526802718585,
"acc_norm": 0.8225453096992631,
"acc_norm_stderr": 0.0038127222858557754
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5333333333333333,
"acc_stderr": 0.043097329010363554,
"acc_norm": 0.5333333333333333,
"acc_norm_stderr": 0.043097329010363554
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5855263157894737,
"acc_stderr": 0.04008973785779206,
"acc_norm": 0.5855263157894737,
"acc_norm_stderr": 0.04008973785779206
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.569811320754717,
"acc_stderr": 0.030471445867183235,
"acc_norm": 0.569811320754717,
"acc_norm_stderr": 0.030471445867183235
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.041227287076512825,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.041227287076512825
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5260115606936416,
"acc_stderr": 0.038073017265045105,
"acc_norm": 0.5260115606936416,
"acc_norm_stderr": 0.038073017265045105
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.04488482852329017,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.04488482852329017
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5063829787234042,
"acc_stderr": 0.032683358999363366,
"acc_norm": 0.5063829787234042,
"acc_norm_stderr": 0.032683358999363366
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.37719298245614036,
"acc_stderr": 0.04559522141958216,
"acc_norm": 0.37719298245614036,
"acc_norm_stderr": 0.04559522141958216
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4482758620689655,
"acc_stderr": 0.04144311810878151,
"acc_norm": 0.4482758620689655,
"acc_norm_stderr": 0.04144311810878151
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.024870815251057082,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.024870815251057082
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04426266681379909,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04426266681379909
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6258064516129033,
"acc_stderr": 0.027528904299845686,
"acc_norm": 0.6258064516129033,
"acc_norm_stderr": 0.027528904299845686
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4433497536945813,
"acc_stderr": 0.03495334582162934,
"acc_norm": 0.4433497536945813,
"acc_norm_stderr": 0.03495334582162934
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.036639749943912434,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.036639749943912434
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.03318477333845331,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.03318477333845331
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7927461139896373,
"acc_stderr": 0.029252823291803627,
"acc_norm": 0.7927461139896373,
"acc_norm_stderr": 0.029252823291803627
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5230769230769231,
"acc_stderr": 0.025323990861736242,
"acc_norm": 0.5230769230769231,
"acc_norm_stderr": 0.025323990861736242
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2851851851851852,
"acc_stderr": 0.027528599210340496,
"acc_norm": 0.2851851851851852,
"acc_norm_stderr": 0.027528599210340496
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5336134453781513,
"acc_stderr": 0.03240501447690071,
"acc_norm": 0.5336134453781513,
"acc_norm_stderr": 0.03240501447690071
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.038227469376587525,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.038227469376587525
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7486238532110092,
"acc_stderr": 0.018599206360287415,
"acc_norm": 0.7486238532110092,
"acc_norm_stderr": 0.018599206360287415
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3611111111111111,
"acc_stderr": 0.032757734861009996,
"acc_norm": 0.3611111111111111,
"acc_norm_stderr": 0.032757734861009996
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6862745098039216,
"acc_stderr": 0.032566854844603886,
"acc_norm": 0.6862745098039216,
"acc_norm_stderr": 0.032566854844603886
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7046413502109705,
"acc_stderr": 0.02969633871342288,
"acc_norm": 0.7046413502109705,
"acc_norm_stderr": 0.02969633871342288
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6143497757847534,
"acc_stderr": 0.03266842214289201,
"acc_norm": 0.6143497757847534,
"acc_norm_stderr": 0.03266842214289201
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5572519083969466,
"acc_stderr": 0.043564472026650695,
"acc_norm": 0.5572519083969466,
"acc_norm_stderr": 0.043564472026650695
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6446280991735537,
"acc_stderr": 0.0436923632657398,
"acc_norm": 0.6446280991735537,
"acc_norm_stderr": 0.0436923632657398
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6018518518518519,
"acc_stderr": 0.04732332615978815,
"acc_norm": 0.6018518518518519,
"acc_norm_stderr": 0.04732332615978815
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6380368098159509,
"acc_stderr": 0.037757007291414416,
"acc_norm": 0.6380368098159509,
"acc_norm_stderr": 0.037757007291414416
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.36607142857142855,
"acc_stderr": 0.045723723587374296,
"acc_norm": 0.36607142857142855,
"acc_norm_stderr": 0.045723723587374296
},
"harness|hendrycksTest-management|5": {
"acc": 0.6699029126213593,
"acc_stderr": 0.046561471100123514,
"acc_norm": 0.6699029126213593,
"acc_norm_stderr": 0.046561471100123514
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7991452991452992,
"acc_stderr": 0.02624677294689048,
"acc_norm": 0.7991452991452992,
"acc_norm_stderr": 0.02624677294689048
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7586206896551724,
"acc_stderr": 0.015302380123542103,
"acc_norm": 0.7586206896551724,
"acc_norm_stderr": 0.015302380123542103
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6069364161849711,
"acc_stderr": 0.026296227915613674,
"acc_norm": 0.6069364161849711,
"acc_norm_stderr": 0.026296227915613674
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2335195530726257,
"acc_stderr": 0.014149575348976266,
"acc_norm": 0.2335195530726257,
"acc_norm_stderr": 0.014149575348976266
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5947712418300654,
"acc_stderr": 0.028110928492809068,
"acc_norm": 0.5947712418300654,
"acc_norm_stderr": 0.028110928492809068
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6688102893890675,
"acc_stderr": 0.02673062072800491,
"acc_norm": 0.6688102893890675,
"acc_norm_stderr": 0.02673062072800491
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6512345679012346,
"acc_stderr": 0.026517597724465013,
"acc_norm": 0.6512345679012346,
"acc_norm_stderr": 0.026517597724465013
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.42907801418439717,
"acc_stderr": 0.029525914302558562,
"acc_norm": 0.42907801418439717,
"acc_norm_stderr": 0.029525914302558562
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.41460234680573665,
"acc_stderr": 0.012582597058908284,
"acc_norm": 0.41460234680573665,
"acc_norm_stderr": 0.012582597058908284
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5257352941176471,
"acc_stderr": 0.03033257809455504,
"acc_norm": 0.5257352941176471,
"acc_norm_stderr": 0.03033257809455504
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5784313725490197,
"acc_stderr": 0.019977422600227477,
"acc_norm": 0.5784313725490197,
"acc_norm_stderr": 0.019977422600227477
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5909090909090909,
"acc_stderr": 0.04709306978661895,
"acc_norm": 0.5909090909090909,
"acc_norm_stderr": 0.04709306978661895
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5183673469387755,
"acc_stderr": 0.03198761546763127,
"acc_norm": 0.5183673469387755,
"acc_norm_stderr": 0.03198761546763127
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7313432835820896,
"acc_stderr": 0.031343283582089536,
"acc_norm": 0.7313432835820896,
"acc_norm_stderr": 0.031343283582089536
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.8,
"acc_stderr": 0.040201512610368445,
"acc_norm": 0.8,
"acc_norm_stderr": 0.040201512610368445
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5120481927710844,
"acc_stderr": 0.038913644958358175,
"acc_norm": 0.5120481927710844,
"acc_norm_stderr": 0.038913644958358175
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7485380116959064,
"acc_stderr": 0.033275044238468436,
"acc_norm": 0.7485380116959064,
"acc_norm_stderr": 0.033275044238468436
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3561811505507956,
"mc1_stderr": 0.01676379072844633,
"mc2": 0.49583823304450064,
"mc2_stderr": 0.015451218745335026
}
}
```
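Given the dictionary printed above, the per-task scores are easy to compare against the aggregate "all" entry. The following is a minimal sketch over the structure as displayed; it hard-codes a few entries copied from the block above so it runs standalone, rather than downloading the raw file:
```python
# A small excerpt of the results dictionary shown above, copied here so
# the snippet runs without any download.
results = {
    "all": {"acc_norm": 0.5473439940952133},
    "harness|hendrycksTest-us_foreign_policy|5": {"acc_norm": 0.8},
    "harness|hendrycksTest-moral_scenarios|5": {"acc_norm": 0.2335195530726257},
}

# Per-task normalized accuracy, skipping the aggregate "all" entry.
per_task = {
    task: metrics["acc_norm"]
    for task, metrics in results.items()
    if task != "all" and "acc_norm" in metrics
}

best = max(per_task, key=per_task.get)
worst = min(per_task, key=per_task.get)
print(f"overall acc_norm: {results['all']['acc_norm']:.4f}")
print(f"best task:  {best} ({per_task[best]:.4f})")
print(f"worst task: {worst} ({per_task[worst]:.4f})")
```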
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
dmrau/cqudupstack-tex | 2023-10-09T12:37:57.000Z | [
"region:us"
] | dmrau | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: queries
path: data/queries-*
- split: corpus
path: data/corpus-*
dataset_info:
features:
- name: _id
dtype: string
- name: text
dtype: string
- name: title
dtype: string
splits:
- name: queries
num_bytes: 186934
num_examples: 2906
- name: corpus
num_bytes: 86600423
num_examples: 68184
download_size: 43424126
dataset_size: 86787357
---
# Dataset Card for "cqudupstack-tex"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dmrau/cqudupstack-webmasters | 2023-10-09T12:38:04.000Z | [
"region:us"
] | dmrau | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: queries
path: data/queries-*
- split: corpus
path: data/corpus-*
dataset_info:
features:
- name: _id
dtype: string
- name: text
dtype: string
- name: title
dtype: string
splits:
- name: queries
num_bytes: 34792
num_examples: 506
- name: corpus
num_bytes: 11659413
num_examples: 17405
download_size: 6885106
dataset_size: 11694205
---
# Dataset Card for "cqudupstack-webmasters"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dmrau/cqudupstack-english | 2023-10-09T12:38:18.000Z | [
"region:us"
] | dmrau | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: queries
path: data/queries-*
- split: corpus
path: data/corpus-*
dataset_info:
features:
- name: _id
dtype: string
- name: text
dtype: string
- name: title
dtype: string
splits:
- name: queries
num_bytes: 103588
num_examples: 1570
- name: corpus
num_bytes: 18199570
num_examples: 40221
download_size: 11382247
dataset_size: 18303158
---
# Dataset Card for "cqudupstack-english"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dmrau/cqudupstack-gis | 2023-10-09T12:38:30.000Z | [
"region:us"
] | dmrau | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: queries
path: data/queries-*
- split: corpus
path: data/corpus-*
dataset_info:
features:
- name: _id
dtype: string
- name: text
dtype: string
- name: title
dtype: string
splits:
- name: queries
num_bytes: 61244
num_examples: 885
- name: corpus
num_bytes: 36704924
num_examples: 37637
download_size: 20083359
dataset_size: 36766168
---
# Dataset Card for "cqudupstack-gis"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dmrau/cqudupstack-physics | 2023-10-09T12:38:36.000Z | [
"region:us"
] | dmrau | null | null | null | 0 | 0 | Entry not found |
harinarayan/my_final_dataset | 2023-10-09T12:39:16.000Z | [
"region:us"
] | harinarayan | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 472226.0
num_examples: 33
download_size: 471990
dataset_size: 472226.0
---
# Dataset Card for "my_final_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Praghxx/Ricky | 2023-10-09T12:48:52.000Z | [
"license:openrail",
"region:us"
] | Praghxx | null | null | null | 0 | 0 | ---
license: openrail
---
|
Shruti9756/contract-nli-reformat-rlhf | 2023-10-09T12:49:16.000Z | [
"region:us"
] | Shruti9756 | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_akjindal53244__Mistral-7B-v0.1-Open-Platypus | 2023-10-09T12:54:05.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of akjindal53244/Mistral-7B-v0.1-Open-Platypus
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [akjindal53244/Mistral-7B-v0.1-Open-Platypus](https://huggingface.co/akjindal53244/Mistral-7B-v0.1-Open-Platypus)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_akjindal53244__Mistral-7B-v0.1-Open-Platypus\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-09T12:52:41.880840](https://huggingface.co/datasets/open-llm-leaderboard/details_akjindal53244__Mistral-7B-v0.1-Open-Platypus/blob/main/results_2023-10-09T12-52-41.880840.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6373475749467716,\n\
\ \"acc_stderr\": 0.03299569980721215,\n \"acc_norm\": 0.6412780300710569,\n\
\ \"acc_norm_stderr\": 0.03297156521536439,\n \"mc1\": 0.32558139534883723,\n\
\ \"mc1_stderr\": 0.01640398946990783,\n \"mc2\": 0.4732846266548936,\n\
\ \"mc2_stderr\": 0.015063591761555577\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5836177474402731,\n \"acc_stderr\": 0.01440561827943617,\n\
\ \"acc_norm\": 0.6237201365187713,\n \"acc_norm_stderr\": 0.014157022555407158\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6590320653256323,\n\
\ \"acc_stderr\": 0.004730658073041557,\n \"acc_norm\": 0.8508265285799641,\n\
\ \"acc_norm_stderr\": 0.003555312878052388\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621502,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621502\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.562962962962963,\n\
\ \"acc_stderr\": 0.04284958639753401,\n \"acc_norm\": 0.562962962962963,\n\
\ \"acc_norm_stderr\": 0.04284958639753401\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6754716981132075,\n \"acc_stderr\": 0.028815615713432108,\n\
\ \"acc_norm\": 0.6754716981132075,\n \"acc_norm_stderr\": 0.028815615713432108\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7013888888888888,\n\
\ \"acc_stderr\": 0.03827052357950756,\n \"acc_norm\": 0.7013888888888888,\n\
\ \"acc_norm_stderr\": 0.03827052357950756\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\"\
: 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411019,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411019\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6763005780346821,\n\
\ \"acc_stderr\": 0.0356760379963917,\n \"acc_norm\": 0.6763005780346821,\n\
\ \"acc_norm_stderr\": 0.0356760379963917\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.048786087144669955,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.048786087144669955\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5574468085106383,\n \"acc_stderr\": 0.03246956919789958,\n\
\ \"acc_norm\": 0.5574468085106383,\n \"acc_norm_stderr\": 0.03246956919789958\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n\
\ \"acc_stderr\": 0.04677473004491199,\n \"acc_norm\": 0.4473684210526316,\n\
\ \"acc_norm_stderr\": 0.04677473004491199\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.40476190476190477,\n \"acc_stderr\": 0.025279850397404904,\n \"\
acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.025279850397404904\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.49206349206349204,\n\
\ \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.49206349206349204,\n\
\ \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7225806451612903,\n \"acc_stderr\": 0.025470196835900055,\n \"\
acc_norm\": 0.7225806451612903,\n \"acc_norm_stderr\": 0.025470196835900055\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5024630541871922,\n \"acc_stderr\": 0.03517945038691063,\n \"\
acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7525252525252525,\n \"acc_stderr\": 0.030746300742124498,\n \"\
acc_norm\": 0.7525252525252525,\n \"acc_norm_stderr\": 0.030746300742124498\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.617948717948718,\n \"acc_stderr\": 0.024635549163908234,\n \
\ \"acc_norm\": 0.617948717948718,\n \"acc_norm_stderr\": 0.024635549163908234\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.36666666666666664,\n \"acc_stderr\": 0.02938162072646507,\n \
\ \"acc_norm\": 0.36666666666666664,\n \"acc_norm_stderr\": 0.02938162072646507\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6596638655462185,\n \"acc_stderr\": 0.030778057422931673,\n\
\ \"acc_norm\": 0.6596638655462185,\n \"acc_norm_stderr\": 0.030778057422931673\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8146788990825689,\n \"acc_stderr\": 0.016659279700295838,\n \"\
acc_norm\": 0.8146788990825689,\n \"acc_norm_stderr\": 0.016659279700295838\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4861111111111111,\n \"acc_stderr\": 0.03408655867977748,\n \"\
acc_norm\": 0.4861111111111111,\n \"acc_norm_stderr\": 0.03408655867977748\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7892156862745098,\n \"acc_stderr\": 0.02862654791243741,\n \"\
acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.02862654791243741\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7890295358649789,\n \"acc_stderr\": 0.02655837250266192,\n \
\ \"acc_norm\": 0.7890295358649789,\n \"acc_norm_stderr\": 0.02655837250266192\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.726457399103139,\n\
\ \"acc_stderr\": 0.029918586707798827,\n \"acc_norm\": 0.726457399103139,\n\
\ \"acc_norm_stderr\": 0.029918586707798827\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8016528925619835,\n \"acc_stderr\": 0.036401182719909456,\n \"\
acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.036401182719909456\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.03826076324884863,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.03826076324884863\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.037601780060266196,\n\
\ \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.037601780060266196\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n\
\ \"acc_stderr\": 0.020588491316092368,\n \"acc_norm\": 0.8888888888888888,\n\
\ \"acc_norm_stderr\": 0.020588491316092368\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8071519795657727,\n\
\ \"acc_stderr\": 0.014108533515757431,\n \"acc_norm\": 0.8071519795657727,\n\
\ \"acc_norm_stderr\": 0.014108533515757431\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7167630057803468,\n \"acc_stderr\": 0.024257901705323378,\n\
\ \"acc_norm\": 0.7167630057803468,\n \"acc_norm_stderr\": 0.024257901705323378\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.35307262569832404,\n\
\ \"acc_stderr\": 0.015984204545268565,\n \"acc_norm\": 0.35307262569832404,\n\
\ \"acc_norm_stderr\": 0.015984204545268565\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7450980392156863,\n \"acc_stderr\": 0.02495418432487991,\n\
\ \"acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.02495418432487991\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n\
\ \"acc_stderr\": 0.025670259242188943,\n \"acc_norm\": 0.7138263665594855,\n\
\ \"acc_norm_stderr\": 0.025670259242188943\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7561728395061729,\n \"acc_stderr\": 0.023891879541959607,\n\
\ \"acc_norm\": 0.7561728395061729,\n \"acc_norm_stderr\": 0.023891879541959607\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.49869621903520206,\n\
\ \"acc_stderr\": 0.012770192691057116,\n \"acc_norm\": 0.49869621903520206,\n\
\ \"acc_norm_stderr\": 0.012770192691057116\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.028245687391462923,\n\
\ \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.028245687391462923\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6862745098039216,\n \"acc_stderr\": 0.018771683893528176,\n \
\ \"acc_norm\": 0.6862745098039216,\n \"acc_norm_stderr\": 0.018771683893528176\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7061224489795919,\n \"acc_stderr\": 0.02916273841024977,\n\
\ \"acc_norm\": 0.7061224489795919,\n \"acc_norm_stderr\": 0.02916273841024977\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454115,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454115\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n\
\ \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.32558139534883723,\n\
\ \"mc1_stderr\": 0.01640398946990783,\n \"mc2\": 0.4732846266548936,\n\
\ \"mc2_stderr\": 0.015063591761555577\n }\n}\n```"
repo_url: https://huggingface.co/akjindal53244/Mistral-7B-v0.1-Open-Platypus
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|arc:challenge|25_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hellaswag|10_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T12-52-41.880840.parquet'
- config_name: results
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- results_2023-10-09T12-52-41.880840.parquet
- split: latest
path:
- results_2023-10-09T12-52-41.880840.parquet
---
# Dataset Card for Evaluation run of akjindal53244/Mistral-7B-v0.1-Open-Platypus
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/akjindal53244/Mistral-7B-v0.1-Open-Platypus
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [akjindal53244/Mistral-7B-v0.1-Open-Platypus](https://huggingface.co/akjindal53244/Mistral-7B-v0.1-Open-Platypus) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_akjindal53244__Mistral-7B-v0.1-Open-Platypus",
"harness_truthfulqa_mc_0",
split="train")
```
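For example, to pull the most recent details for a single subtask, pass one of the config names listed in this card's metadata together with the `latest` split (a minimal sketch, assuming the standard `datasets` API):
```python
from datasets import load_dataset

# "latest" always points at the parquet files from the newest run,
# so this stays valid even after the dataset is re-evaluated.
details = load_dataset(
    "open-llm-leaderboard/details_akjindal53244__Mistral-7B-v0.1-Open-Platypus",
    "harness_hendrycksTest_abstract_algebra_5",
    split="latest",
)
print(details[0])  # inspect one evaluated example
```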
## Latest results
These are the [latest results from run 2023-10-09T12:52:41.880840](https://huggingface.co/datasets/open-llm-leaderboard/details_akjindal53244__Mistral-7B-v0.1-Open-Platypus/blob/main/results_2023-10-09T12-52-41.880840.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6373475749467716,
"acc_stderr": 0.03299569980721215,
"acc_norm": 0.6412780300710569,
"acc_norm_stderr": 0.03297156521536439,
"mc1": 0.32558139534883723,
"mc1_stderr": 0.01640398946990783,
"mc2": 0.4732846266548936,
"mc2_stderr": 0.015063591761555577
},
"harness|arc:challenge|25": {
"acc": 0.5836177474402731,
"acc_stderr": 0.01440561827943617,
"acc_norm": 0.6237201365187713,
"acc_norm_stderr": 0.014157022555407158
},
"harness|hellaswag|10": {
"acc": 0.6590320653256323,
"acc_stderr": 0.004730658073041557,
"acc_norm": 0.8508265285799641,
"acc_norm_stderr": 0.003555312878052388
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621502,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621502
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.562962962962963,
"acc_stderr": 0.04284958639753401,
"acc_norm": 0.562962962962963,
"acc_norm_stderr": 0.04284958639753401
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119669,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119669
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6754716981132075,
"acc_stderr": 0.028815615713432108,
"acc_norm": 0.6754716981132075,
"acc_norm_stderr": 0.028815615713432108
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7013888888888888,
"acc_stderr": 0.03827052357950756,
"acc_norm": 0.7013888888888888,
"acc_norm_stderr": 0.03827052357950756
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.0356760379963917,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.0356760379963917
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.048786087144669955,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.048786087144669955
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5574468085106383,
"acc_stderr": 0.03246956919789958,
"acc_norm": 0.5574468085106383,
"acc_norm_stderr": 0.03246956919789958
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.04677473004491199,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.04677473004491199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.025279850397404904,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.025279850397404904
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.49206349206349204,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.49206349206349204,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7225806451612903,
"acc_stderr": 0.025470196835900055,
"acc_norm": 0.7225806451612903,
"acc_norm_stderr": 0.025470196835900055
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7525252525252525,
"acc_stderr": 0.030746300742124498,
"acc_norm": 0.7525252525252525,
"acc_norm_stderr": 0.030746300742124498
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.617948717948718,
"acc_stderr": 0.024635549163908234,
"acc_norm": 0.617948717948718,
"acc_norm_stderr": 0.024635549163908234
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.36666666666666664,
"acc_stderr": 0.02938162072646507,
"acc_norm": 0.36666666666666664,
"acc_norm_stderr": 0.02938162072646507
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6596638655462185,
"acc_stderr": 0.030778057422931673,
"acc_norm": 0.6596638655462185,
"acc_norm_stderr": 0.030778057422931673
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8146788990825689,
"acc_stderr": 0.016659279700295838,
"acc_norm": 0.8146788990825689,
"acc_norm_stderr": 0.016659279700295838
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4861111111111111,
"acc_stderr": 0.03408655867977748,
"acc_norm": 0.4861111111111111,
"acc_norm_stderr": 0.03408655867977748
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.02862654791243741,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.02862654791243741
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7890295358649789,
"acc_stderr": 0.02655837250266192,
"acc_norm": 0.7890295358649789,
"acc_norm_stderr": 0.02655837250266192
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.726457399103139,
"acc_stderr": 0.029918586707798827,
"acc_norm": 0.726457399103139,
"acc_norm_stderr": 0.029918586707798827
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.036401182719909456,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.036401182719909456
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.03826076324884863,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.03826076324884863
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5,
"acc_stderr": 0.04745789978762494,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04745789978762494
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.037601780060266196,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.037601780060266196
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092368,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092368
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8071519795657727,
"acc_stderr": 0.014108533515757431,
"acc_norm": 0.8071519795657727,
"acc_norm_stderr": 0.014108533515757431
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7167630057803468,
"acc_stderr": 0.024257901705323378,
"acc_norm": 0.7167630057803468,
"acc_norm_stderr": 0.024257901705323378
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.35307262569832404,
"acc_stderr": 0.015984204545268565,
"acc_norm": 0.35307262569832404,
"acc_norm_stderr": 0.015984204545268565
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7450980392156863,
"acc_stderr": 0.02495418432487991,
"acc_norm": 0.7450980392156863,
"acc_norm_stderr": 0.02495418432487991
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.025670259242188943,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.025670259242188943
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7561728395061729,
"acc_stderr": 0.023891879541959607,
"acc_norm": 0.7561728395061729,
"acc_norm_stderr": 0.023891879541959607
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.49869621903520206,
"acc_stderr": 0.012770192691057116,
"acc_norm": 0.49869621903520206,
"acc_norm_stderr": 0.012770192691057116
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.028245687391462923,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.028245687391462923
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6862745098039216,
"acc_stderr": 0.018771683893528176,
"acc_norm": 0.6862745098039216,
"acc_norm_stderr": 0.018771683893528176
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7061224489795919,
"acc_stderr": 0.02916273841024977,
"acc_norm": 0.7061224489795919,
"acc_norm_stderr": 0.02916273841024977
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454115,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454115
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.32558139534883723,
"mc1_stderr": 0.01640398946990783,
"mc2": 0.4732846266548936,
"mc2_stderr": 0.015063591761555577
}
}
```
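If you prefer working with the raw results file shown above rather than the `results` config, it can be downloaded directly; a minimal sketch using `huggingface_hub` (the exact top-level layout of the JSON on disk is an assumption here, so adjust the key access if the file nests the metrics differently):
```python
import json

from huggingface_hub import hf_hub_download

# repo_type="dataset" is required because these details live in a
# dataset repository, not a model repository.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_akjindal53244__Mistral-7B-v0.1-Open-Platypus",
    filename="results_2023-10-09T12-52-41.880840.json",
    repo_type="dataset",
)
with open(path) as f:
    results = json.load(f)

# The excerpt above exposes an "all" entry with the aggregate metrics;
# fall back to the full dict if the key is nested elsewhere.
print(results.get("all", results))
```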
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
DjangoDeng/llama2-chinese | 2023-10-09T12:56:24.000Z | [
"license:mit",
"region:us"
] | DjangoDeng | null | null | null | 0 | 0 | ---
license: mit
---
|
open-llm-leaderboard/details_PeanutJar__Mistral-v0.1-PeanutButter-v0.0.0-7B | 2023-10-09T13:05:20.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of PeanutJar/Mistral-v0.1-PeanutButter-v0.0.0-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [PeanutJar/Mistral-v0.1-PeanutButter-v0.0.0-7B](https://huggingface.co/PeanutJar/Mistral-v0.1-PeanutButter-v0.0.0-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_PeanutJar__Mistral-v0.1-PeanutButter-v0.0.0-7B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-09T13:03:57.822479](https://huggingface.co/datasets/open-llm-leaderboard/details_PeanutJar__Mistral-v0.1-PeanutButter-v0.0.0-7B/blob/main/results_2023-10-09T13-03-57.822479.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and in the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6403621776118597,\n\
\ \"acc_stderr\": 0.0328820699921915,\n \"acc_norm\": 0.6444418901610861,\n\
\ \"acc_norm_stderr\": 0.03285864326132893,\n \"mc1\": 0.3108935128518972,\n\
\ \"mc1_stderr\": 0.016203316673559693,\n \"mc2\": 0.46942157956807357,\n\
\ \"mc2_stderr\": 0.01469732561783109\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5827645051194539,\n \"acc_stderr\": 0.014409825518403079,\n\
\ \"acc_norm\": 0.6220136518771331,\n \"acc_norm_stderr\": 0.014169664520303098\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6395140410276837,\n\
\ \"acc_stderr\": 0.004791601975612765,\n \"acc_norm\": 0.8409679346743677,\n\
\ \"acc_norm_stderr\": 0.003649585852821188\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n\
\ \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n\
\ \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322666,\n\
\ \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322666\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7291666666666666,\n\
\ \"acc_stderr\": 0.03716177437566017,\n \"acc_norm\": 0.7291666666666666,\n\
\ \"acc_norm_stderr\": 0.03716177437566017\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n\
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6416184971098265,\n\
\ \"acc_stderr\": 0.036563436533531585,\n \"acc_norm\": 0.6416184971098265,\n\
\ \"acc_norm_stderr\": 0.036563436533531585\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n\
\ \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n\
\ \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3783068783068783,\n \"acc_stderr\": 0.024976954053155254,\n \"\
acc_norm\": 0.3783068783068783,\n \"acc_norm_stderr\": 0.024976954053155254\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.38095238095238093,\n\
\ \"acc_stderr\": 0.04343525428949098,\n \"acc_norm\": 0.38095238095238093,\n\
\ \"acc_norm_stderr\": 0.04343525428949098\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7483870967741936,\n\
\ \"acc_stderr\": 0.024685979286239956,\n \"acc_norm\": 0.7483870967741936,\n\
\ \"acc_norm_stderr\": 0.024685979286239956\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5270935960591133,\n \"acc_stderr\": 0.03512819077876106,\n\
\ \"acc_norm\": 0.5270935960591133,\n \"acc_norm_stderr\": 0.03512819077876106\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.03287666758603491,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.03287666758603491\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7777777777777778,\n \"acc_stderr\": 0.02962022787479048,\n \"\
acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.02962022787479048\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.02463978909770944,\n\
\ \"acc_norm\": 0.8652849740932642,\n \"acc_norm_stderr\": 0.02463978909770944\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6461538461538462,\n \"acc_stderr\": 0.024243783994062153,\n\
\ \"acc_norm\": 0.6461538461538462,\n \"acc_norm_stderr\": 0.024243783994062153\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34814814814814815,\n \"acc_stderr\": 0.029045600290616258,\n \
\ \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.029045600290616258\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6302521008403361,\n \"acc_stderr\": 0.03135709599613591,\n \
\ \"acc_norm\": 0.6302521008403361,\n \"acc_norm_stderr\": 0.03135709599613591\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8238532110091743,\n \"acc_stderr\": 0.016332882393431385,\n \"\
acc_norm\": 0.8238532110091743,\n \"acc_norm_stderr\": 0.016332882393431385\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5370370370370371,\n \"acc_stderr\": 0.03400603625538272,\n \"\
acc_norm\": 0.5370370370370371,\n \"acc_norm_stderr\": 0.03400603625538272\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7941176470588235,\n \"acc_stderr\": 0.028379449451588667,\n \"\
acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.028379449451588667\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7637130801687764,\n \"acc_stderr\": 0.027652153144159263,\n \
\ \"acc_norm\": 0.7637130801687764,\n \"acc_norm_stderr\": 0.027652153144159263\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n\
\ \"acc_stderr\": 0.030769352008229143,\n \"acc_norm\": 0.6995515695067265,\n\
\ \"acc_norm_stderr\": 0.030769352008229143\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159463,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159463\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"\
acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7975460122699386,\n \"acc_stderr\": 0.03157065078911901,\n\
\ \"acc_norm\": 0.7975460122699386,\n \"acc_norm_stderr\": 0.03157065078911901\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8349514563106796,\n \"acc_stderr\": 0.036756688322331886,\n\
\ \"acc_norm\": 0.8349514563106796,\n \"acc_norm_stderr\": 0.036756688322331886\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\
\ \"acc_stderr\": 0.02158649400128139,\n \"acc_norm\": 0.8760683760683761,\n\
\ \"acc_norm_stderr\": 0.02158649400128139\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8122605363984674,\n\
\ \"acc_stderr\": 0.013964393769899126,\n \"acc_norm\": 0.8122605363984674,\n\
\ \"acc_norm_stderr\": 0.013964393769899126\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7138728323699421,\n \"acc_stderr\": 0.02433214677913413,\n\
\ \"acc_norm\": 0.7138728323699421,\n \"acc_norm_stderr\": 0.02433214677913413\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.31731843575418994,\n\
\ \"acc_stderr\": 0.01556639263005703,\n \"acc_norm\": 0.31731843575418994,\n\
\ \"acc_norm_stderr\": 0.01556639263005703\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7581699346405228,\n \"acc_stderr\": 0.024518195641879334,\n\
\ \"acc_norm\": 0.7581699346405228,\n \"acc_norm_stderr\": 0.024518195641879334\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n\
\ \"acc_stderr\": 0.025839898334877983,\n \"acc_norm\": 0.707395498392283,\n\
\ \"acc_norm_stderr\": 0.025839898334877983\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7438271604938271,\n \"acc_stderr\": 0.024288533637726095,\n\
\ \"acc_norm\": 0.7438271604938271,\n \"acc_norm_stderr\": 0.024288533637726095\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5070921985815603,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.5070921985815603,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4556714471968709,\n\
\ \"acc_stderr\": 0.012719949543032207,\n \"acc_norm\": 0.4556714471968709,\n\
\ \"acc_norm_stderr\": 0.012719949543032207\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6654411764705882,\n \"acc_stderr\": 0.028661996202335303,\n\
\ \"acc_norm\": 0.6654411764705882,\n \"acc_norm_stderr\": 0.028661996202335303\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6781045751633987,\n \"acc_stderr\": 0.01890101532209309,\n \
\ \"acc_norm\": 0.6781045751633987,\n \"acc_norm_stderr\": 0.01890101532209309\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n\
\ \"acc_stderr\": 0.04631381319425465,\n \"acc_norm\": 0.6272727272727273,\n\
\ \"acc_norm_stderr\": 0.04631381319425465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128448,\n\
\ \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128448\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n\
\ \"acc_stderr\": 0.026508590656233264,\n \"acc_norm\": 0.8308457711442786,\n\
\ \"acc_norm_stderr\": 0.026508590656233264\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \
\ \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.038695433234721015,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.038695433234721015\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.02991312723236804,\n\
\ \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.02991312723236804\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3108935128518972,\n\
\ \"mc1_stderr\": 0.016203316673559693,\n \"mc2\": 0.46942157956807357,\n\
\ \"mc2_stderr\": 0.01469732561783109\n }\n}\n```"
repo_url: https://huggingface.co/PeanutJar/Mistral-v0.1-PeanutButter-v0.0.0-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|arc:challenge|25_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hellaswag|10_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T13-03-57.822479.parquet'
- config_name: results
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- results_2023-10-09T13-03-57.822479.parquet
- split: latest
path:
- results_2023-10-09T13-03-57.822479.parquet
---
# Dataset Card for Evaluation run of PeanutJar/Mistral-v0.1-PeanutButter-v0.0.0-7B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/PeanutJar/Mistral-v0.1-PeanutButter-v0.0.0-7B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [PeanutJar/Mistral-v0.1-PeanutButter-v0.0.0-7B](https://huggingface.co/PeanutJar/Mistral-v0.1-PeanutButter-v0.0.0-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_PeanutJar__Mistral-v0.1-PeanutButter-v0.0.0-7B",
"harness_truthfulqa_mc_0",
split="train")
```
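The same pattern works for any other configuration listed in the YAML header above. As a minimal sketch (the config and split names below are taken from that header, not invented), you could also pull the aggregated metrics or the details of one specific timestamped run:
```python
from datasets import load_dataset

# Aggregated metrics of the run, via the "results" configuration.
results = load_dataset(
    "open-llm-leaderboard/details_PeanutJar__Mistral-v0.1-PeanutButter-v0.0.0-7B",
    "results",
    split="latest",
)

# Per-sample details of one specific run, via its timestamped split.
arc_run = load_dataset(
    "open-llm-leaderboard/details_PeanutJar__Mistral-v0.1-PeanutButter-v0.0.0-7B",
    "harness_arc_challenge_25",
    split="2023_10_09T13_03_57.822479",
)
```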
## Latest results
These are the [latest results from run 2023-10-09T13:03:57.822479](https://huggingface.co/datasets/open-llm-leaderboard/details_PeanutJar__Mistral-v0.1-PeanutButter-v0.0.0-7B/blob/main/results_2023-10-09T13-03-57.822479.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the "results" config and the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6403621776118597,
"acc_stderr": 0.0328820699921915,
"acc_norm": 0.6444418901610861,
"acc_norm_stderr": 0.03285864326132893,
"mc1": 0.3108935128518972,
"mc1_stderr": 0.016203316673559693,
"mc2": 0.46942157956807357,
"mc2_stderr": 0.01469732561783109
},
"harness|arc:challenge|25": {
"acc": 0.5827645051194539,
"acc_stderr": 0.014409825518403079,
"acc_norm": 0.6220136518771331,
"acc_norm_stderr": 0.014169664520303098
},
"harness|hellaswag|10": {
"acc": 0.6395140410276837,
"acc_stderr": 0.004791601975612765,
"acc_norm": 0.8409679346743677,
"acc_norm_stderr": 0.003649585852821188
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6943396226415094,
"acc_stderr": 0.028353298073322666,
"acc_norm": 0.6943396226415094,
"acc_norm_stderr": 0.028353298073322666
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7291666666666666,
"acc_stderr": 0.03716177437566017,
"acc_norm": 0.7291666666666666,
"acc_norm_stderr": 0.03716177437566017
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.036563436533531585,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.036563436533531585
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3783068783068783,
"acc_stderr": 0.024976954053155254,
"acc_norm": 0.3783068783068783,
"acc_norm_stderr": 0.024976954053155254
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.04343525428949098,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.04343525428949098
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7483870967741936,
"acc_stderr": 0.024685979286239956,
"acc_norm": 0.7483870967741936,
"acc_norm_stderr": 0.024685979286239956
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5270935960591133,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.5270935960591133,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.03287666758603491,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.03287666758603491
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.02962022787479048,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.02962022787479048
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8652849740932642,
"acc_stderr": 0.02463978909770944,
"acc_norm": 0.8652849740932642,
"acc_norm_stderr": 0.02463978909770944
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6461538461538462,
"acc_stderr": 0.024243783994062153,
"acc_norm": 0.6461538461538462,
"acc_norm_stderr": 0.024243783994062153
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.029045600290616258,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.029045600290616258
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6302521008403361,
"acc_stderr": 0.03135709599613591,
"acc_norm": 0.6302521008403361,
"acc_norm_stderr": 0.03135709599613591
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8238532110091743,
"acc_stderr": 0.016332882393431385,
"acc_norm": 0.8238532110091743,
"acc_norm_stderr": 0.016332882393431385
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.03400603625538272,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.03400603625538272
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.028379449451588667,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.028379449451588667
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7637130801687764,
"acc_stderr": 0.027652153144159263,
"acc_norm": 0.7637130801687764,
"acc_norm_stderr": 0.027652153144159263
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6995515695067265,
"acc_stderr": 0.030769352008229143,
"acc_norm": 0.6995515695067265,
"acc_norm_stderr": 0.030769352008229143
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159463,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159463
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7975460122699386,
"acc_stderr": 0.03157065078911901,
"acc_norm": 0.7975460122699386,
"acc_norm_stderr": 0.03157065078911901
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5,
"acc_stderr": 0.04745789978762494,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04745789978762494
},
"harness|hendrycksTest-management|5": {
"acc": 0.8349514563106796,
"acc_stderr": 0.036756688322331886,
"acc_norm": 0.8349514563106796,
"acc_norm_stderr": 0.036756688322331886
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.02158649400128139,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.02158649400128139
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8122605363984674,
"acc_stderr": 0.013964393769899126,
"acc_norm": 0.8122605363984674,
"acc_norm_stderr": 0.013964393769899126
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7138728323699421,
"acc_stderr": 0.02433214677913413,
"acc_norm": 0.7138728323699421,
"acc_norm_stderr": 0.02433214677913413
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.31731843575418994,
"acc_stderr": 0.01556639263005703,
"acc_norm": 0.31731843575418994,
"acc_norm_stderr": 0.01556639263005703
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7581699346405228,
"acc_stderr": 0.024518195641879334,
"acc_norm": 0.7581699346405228,
"acc_norm_stderr": 0.024518195641879334
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.025839898334877983,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.025839898334877983
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7438271604938271,
"acc_stderr": 0.024288533637726095,
"acc_norm": 0.7438271604938271,
"acc_norm_stderr": 0.024288533637726095
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5070921985815603,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.5070921985815603,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4556714471968709,
"acc_stderr": 0.012719949543032207,
"acc_norm": 0.4556714471968709,
"acc_norm_stderr": 0.012719949543032207
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6654411764705882,
"acc_stderr": 0.028661996202335303,
"acc_norm": 0.6654411764705882,
"acc_norm_stderr": 0.028661996202335303
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6781045751633987,
"acc_stderr": 0.01890101532209309,
"acc_norm": 0.6781045751633987,
"acc_norm_stderr": 0.01890101532209309
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6272727272727273,
"acc_stderr": 0.04631381319425465,
"acc_norm": 0.6272727272727273,
"acc_norm_stderr": 0.04631381319425465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.028535560337128448,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.028535560337128448
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.026508590656233264,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.026508590656233264
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.038695433234721015,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.038695433234721015
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.02991312723236804,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.02991312723236804
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3108935128518972,
"mc1_stderr": 0.016203316673559693,
"mc2": 0.46942157956807357,
"mc2_stderr": 0.01469732561783109
}
}
```
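Since successive evals may add tasks, it can be useful to enumerate what is actually stored before loading anything. A small sketch using the `datasets` library's config-listing helper:
```python
from datasets import get_dataset_config_names

# One config per evaluated task, plus the aggregated "results" config.
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_PeanutJar__Mistral-v0.1-PeanutButter-v0.0.0-7B"
)
print(configs)
```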
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
nitinbhayana/review-phrases-sentiments-v2 | 2023-10-09T13:09:06.000Z | [
"region:us"
] | nitinbhayana | null | null | null | 0 | 0 | Entry not found |
Rootreck/so-vits-svc-4.0-Fallout_4 | 2023-10-09T15:30:27.000Z | [
"region:us"
] | Rootreck | null | null | null | 0 | 0 | Entry not found |
nitinbhayana/keyword-category-brand-v1 | 2023-10-09T13:11:05.000Z | [
"region:us"
] | nitinbhayana | null | null | null | 0 | 0 | Entry not found |
harinarayan/my_newest_dataset | 2023-10-09T13:22:45.000Z | [
"region:us"
] | harinarayan | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 1558328.0
num_examples: 36
download_size: 1436147
dataset_size: 1558328.0
---
# Dataset Card for "my_newest_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
LoliUsa/nva-xxx2 | 2023-10-09T13:23:42.000Z | [
"region:us"
] | LoliUsa | null | null | null | 0 | 0 | Entry not found |
ninjawick/cagliostro-colab-ui | 2023-10-09T13:25:44.000Z | [
"region:us"
] | ninjawick | null | null | null | 0 | 0 | Entry not found |
haiffy/Survey_Faktor_Membeli_HP | 2023-10-09T13:32:26.000Z | [
"task_categories:text-classification",
"size_categories:1K<n<10K",
"language:id",
"finance",
"region:us"
] | haiffy | null | null | null | 0 | 0 | ---
task_categories:
- text-classification
language:
- id
tags:
- finance
pretty_name: keputusan_beli_hp
size_categories:
- 1K<n<10K
--- |
chongli17/AG-Pair | 2023-10-09T13:53:52.000Z | [
"task_categories:text-classification",
"size_categories:100K<n<1M",
"language:en",
"license:unknown",
"region:us"
] | chongli17 | null | null | null | 0 | 0 | ---
license: unknown
task_categories:
- text-classification
language:
- en
size_categories:
- 100K<n<1M
---
## Dataset Creation
The AG-Pair dataset is built from the original AG's News dataset, which contains 120k training samples across four topics.
Given a pair of news items as input, the model has to predict whether they belong to the same topic (Same) or not (Different).
To generate this dataset, the samples in AG's News are iterated in random order, and each one has an equal chance of being paired with a sample from the same topic or from one of the other three topics.
The two classes thus each contain 60k training samples.
Moreover, each news item in AG's News occurs exactly twice in the AG-Pair dataset, keeping the word frequencies unchanged.
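A minimal sketch of this pairing procedure in Python (names are illustrative, not from the original code; this simplified version samples partners with replacement rather than enforcing the exactly-twice constraint of the released dataset):
```python
import random

def build_ag_pair(samples, seed=0):
    # `samples` is a list of (text, topic) tuples from AG's News.
    rng = random.Random(seed)

    # Bucket texts by topic so we can draw same- or different-topic partners.
    by_topic = {}
    for text, topic in samples:
        by_topic.setdefault(topic, []).append(text)

    order = list(samples)
    rng.shuffle(order)  # iterate the samples in random order

    pairs = []
    for text, topic in order:
        if rng.random() < 0.5:
            # Equal chance: partner drawn from the same topic -> "Same".
            partner = rng.choice(by_topic[topic])
            label = "Same"
        else:
            # Otherwise: partner drawn from one of the other three topics.
            other = rng.choice([t for t in by_topic if t != topic])
            partner = rng.choice(by_topic[other])
            label = "Different"
        pairs.append((text, partner, label))
    return pairs
```
With 120k anchors and a 50/50 coin, the two classes come out at roughly 60k pairs each, matching the counts above.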
## Additional Information
### Dataset Curators
[Chong Li](mailto:lichong2021@ia.ac.cn)
### Citation Information
```
@inproceedings{li-etal-2023-FunctionalSpecialization,
author = {Chong Li and
Shaonan Wang and
Yunhao Zhang and
Jiajun Zhang and
Chengqing Zong},
title = "Interpreting and Exploiting Functional Specialization in Multi-Head Attention under Multi-task Learning",
booktitle = "Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing",
year = "2023",
address = "Singapore",
publisher = "Association for Computational Linguistics",
}
```
|
harinarayan/my_small_dataset | 2023-10-09T13:50:48.000Z | [
"region:us"
] | harinarayan | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 445121.0
num_examples: 8
download_size: 417058
dataset_size: 445121.0
---
# Dataset Card for "my_small_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |