Column schema of the dump (Min/Max are string lengths for string columns, value ranges for the int64 columns):

| Column | Type | Min | Max |
|---|---|---|---|
| id | string | 2 | 115 |
| lastModified | string | 24 | 24 |
| tags | list | – | – |
| author | string | 2 | 42 |
| description | string | 0 | 68.7k |
| citation | string | 0 | 10.7k |
| cardData | null | – | – |
| likes | int64 | 0 | 3.55k |
| downloads | int64 | 0 | 10.1M |
| card | string | 0 | 1.01M |
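The rows that follow are plain records under this schema; a minimal sketch of working with them in Python (the helper `by_author` is hypothetical, and the sample records are copied from entries later in this dump, keeping only a few of the columns):

```python
# Each row of the dump can be modeled as a plain dict following the schema above.
# Only a subset of columns is shown; values are taken from entries in this dump.
records = [
    {"id": "JonnyHsu/testtest", "author": "JonnyHsu", "likes": 0, "downloads": 0},
    {"id": "JWBickel/BibleDictionaries", "author": "JWBickel", "likes": 2, "downloads": 0},
]

def by_author(rows, author):
    """Return all records created by the given author."""
    return [r for r in rows if r["author"] == author]

print([r["id"] for r in by_author(records, "JWBickel")])  # → ['JWBickel/BibleDictionaries']
```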
### JonnyHsu/testtest
- lastModified: 2023-09-04T17:34:50.000Z
- tags: ["license:openrail", "region:us"]
- author: JonnyHsu
- description / citation / cardData: null
- likes: 0, downloads: 0
- card:

```
---
license: openrail
---
```
### dariolopez/Llama-2-databricks-dolly-oasst1-es-lower-2048-tokens
- lastModified: 2023-09-04T17:48:37.000Z
- tags: ["region:us"]
- author: dariolopez
- description / citation / cardData: null
- likes: 0, downloads: 0
- card:

```
---
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
dataset_info:
  features:
  - name: text
    dtype: string
  splits:
  - name: train
    num_bytes: 18176970.43051152
    num_examples: 18817
  download_size: 9907243
  dataset_size: 18176970.43051152
---
# Dataset Card for "Llama-2-databricks-dolly-oasst1-es-lower-2048-tokens"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
```
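Cards like the one above carry their split statistics inside YAML front matter. A minimal sketch of pulling a count out of such a card string (the helper `extract_num_examples` is hypothetical and uses only a stdlib regex, not a full YAML parser; the card text is the one from this entry):

```python
import re

# Card text as stored in the `card` column of this entry (front matter only).
card = """---
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
dataset_info:
  features:
  - name: text
    dtype: string
  splits:
  - name: train
    num_bytes: 18176970.43051152
    num_examples: 18817
  download_size: 9907243
  dataset_size: 18176970.43051152
---
"""

def extract_num_examples(card_text):
    """Return the first num_examples value in a card's front matter, or None."""
    m = re.search(r"num_examples:\s*(\d+)", card_text)
    return int(m.group(1)) if m else None

print(extract_num_examples(card))  # → 18817
```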
### dariolopez/Llama-2-databricks-dolly-oasst1-es-lower-1024-tokens
- lastModified: 2023-09-04T17:48:41.000Z
- tags: ["region:us"]
- author: dariolopez
- description / citation / cardData: null
- likes: 0, downloads: 0
- card:

```
---
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
dataset_info:
  features:
  - name: text
    dtype: string
  splits:
  - name: train
    num_bytes: 17737446.513527796
    num_examples: 18362
  download_size: 8772119
  dataset_size: 17737446.513527796
---
# Dataset Card for "Llama-2-databricks-dolly-oasst1-es-lower-1024-tokens"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
```
### dariolopez/Llama-2-databricks-dolly-oasst1-es-lower-512-tokens
- lastModified: 2023-09-04T17:48:07.000Z
- tags: ["region:us"]
- author: dariolopez
- description / citation / cardData: null
- likes: 0, downloads: 0
- card:

```
---
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
dataset_info:
  features:
  - name: text
    dtype: string
  splits:
  - name: train
    num_bytes: 15747514.054216867
    num_examples: 16302
  download_size: 6110375
  dataset_size: 15747514.054216867
---
# Dataset Card for "Llama-2-databricks-dolly-oasst1-es-lower-512-tokens"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
```
### bjoernp/oasst25-08-23-filtered
- lastModified: 2023-09-04T17:46:35.000Z
- tags: ["region:us"]
- author: bjoernp
- description / citation / cardData: null
- likes: 0, downloads: 0
- card:

```
---
dataset_info:
  features:
  - name: conversation
    list:
    - name: context
      dtype: 'null'
    - name: creativity
      dtype: float64
    - name: humor
      dtype: float64
    - name: lang
      dtype: string
    - name: quality
      dtype: float64
    - name: role
      dtype: string
    - name: text
      dtype: string
  - name: system_message
    dtype: 'null'
  splits:
  - name: train
    num_bytes: 17152145.58826024
    num_examples: 9105
  download_size: 9881270
  dataset_size: 17152145.58826024
---
# Dataset Card for "oasst25-08-23-filtered"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
```
### Ohwang/Nuclear_tweets_12k
- lastModified: 2023-09-04T18:19:49.000Z
- tags: ["region:us"]
- author: Ohwang
- description / citation / cardData: null
- likes: 0, downloads: 0
- card:

```
---
dataset_info:
  config_name: Nuclear tweets(12k)
  features:
  - name: tweets
    dtype: string
  - name: Label
    dtype: string
  splits:
  - name: train
    num_bytes: 2991435
    num_examples: 12091
  download_size: 0
  dataset_size: 2991435
configs:
- config_name: Nuclear tweets(12k)
  data_files:
  - split: train
    path: Nuclear tweets(12k)/train-*
---
# Dataset Card for "Nuclear_tweets_12k"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
```
### Hothan/tem
- lastModified: 2023-09-04T18:20:18.000Z
- tags: ["region:us"]
- author: Hothan
- description / citation / cardData: null
- likes: 0, downloads: 0
- card: Entry not found
### JWBickel/BibleDictionaries
- lastModified: 2023-09-04T18:16:04.000Z
- tags: ["region:us"]
- author: JWBickel
- description / citation / cardData: null
- likes: 2, downloads: 0
- card: Entry not found
### DrRaja82/test_dataset
- lastModified: 2023-09-05T11:59:17.000Z
- tags: ["region:us"]
- author: DrRaja82
- description / citation / cardData: null
- likes: 0, downloads: 0
- card: Entry not found
### siddanshchawla/paysim_processed
- lastModified: 2023-09-04T18:49:22.000Z
- tags: ["region:us"]
- author: siddanshchawla
- description / citation / cardData: null
- likes: 0, downloads: 0
- card: Entry not found
### marasama/nva-yaeko
- lastModified: 2023-09-04T18:42:54.000Z
- tags: ["region:us"]
- author: marasama
- description / citation / cardData: null
- likes: 0, downloads: 0
- card: Entry not found
### open-llm-leaderboard/details_yeontaek__llama-2-70B-ensemble-v6
- lastModified: 2023-09-04T18:45:26.000Z
- tags: ["region:us"]
- author: open-llm-leaderboard
- description / citation / cardData: null
- likes: 0, downloads: 0
- card (stored value, YAML front matter with escaped newlines; truncated in this dump):
--- pretty_name: Evaluation run of yeontaek/llama-2-70B-ensemble-v6 dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [yeontaek/llama-2-70B-ensemble-v6](https://huggingface.co/yeontaek/llama-2-70B-ensemble-v6)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 61 configuration, each one coresponding to one of the\ \ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" store all the aggregated results of the\ \ run (and is used to compute and display the agregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yeontaek__llama-2-70B-ensemble-v6\"\ ,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\ \nThese are the [latest results from run 2023-09-04T18:44:04.025476](https://huggingface.co/datasets/open-llm-leaderboard/details_yeontaek__llama-2-70B-ensemble-v6/blob/main/results_2023-09-04T18%3A44%3A04.025476.json)(note\ \ that their might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6807030718501101,\n\ \ \"acc_stderr\": 0.03172804575089014,\n \"acc_norm\": 0.6844122287414298,\n\ \ \"acc_norm_stderr\": 0.03169926697255189,\n \"mc1\": 0.43818849449204406,\n\ \ \"mc1_stderr\": 0.017369236164404445,\n \"mc2\": 0.624345044166297,\n\ \ \"mc2_stderr\": 0.014991862964877591\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.6774744027303754,\n \"acc_stderr\": 0.013659980894277373,\n\ \ \"acc_norm\": 0.7098976109215017,\n \"acc_norm_stderr\": 0.013261573677520767\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6856203943437562,\n\ \ \"acc_stderr\": 0.004633194825793846,\n \"acc_norm\": 0.8720374427404899,\n\ \ \"acc_norm_stderr\": 0.003333654120593691\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \ \ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n\ \ \"acc_stderr\": 0.04171654161354543,\n \"acc_norm\": 0.6296296296296297,\n\ \ \"acc_norm_stderr\": 0.04171654161354543\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.7763157894736842,\n \"acc_stderr\": 0.03391160934343604,\n\ \ \"acc_norm\": 0.7763157894736842,\n \"acc_norm_stderr\": 0.03391160934343604\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.7,\n\ \ \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \ \ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.720754716981132,\n \"acc_stderr\": 0.027611163402399715,\n\ \ \"acc_norm\": 0.720754716981132,\n \"acc_norm_stderr\": 0.027611163402399715\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8055555555555556,\n\ \ \"acc_stderr\": 0.03309615177059006,\n \"acc_norm\": 0.8055555555555556,\n\ \ \"acc_norm_stderr\": 0.03309615177059006\n 
},\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \ \ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n\ \ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \ \ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n\ \ \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n\ \ \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.04755129616062947,\n\ \ \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.04755129616062947\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\ \ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.6382978723404256,\n \"acc_stderr\": 0.03141082197596241,\n\ \ \"acc_norm\": 0.6382978723404256,\n \"acc_norm_stderr\": 0.03141082197596241\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.41228070175438597,\n\ \ \"acc_stderr\": 0.04630653203366595,\n \"acc_norm\": 0.41228070175438597,\n\ \ \"acc_norm_stderr\": 0.04630653203366595\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n\ \ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.4603174603174603,\n \"acc_stderr\": 0.02567008063690919,\n \"\ acc_norm\": 0.4603174603174603,\n 
\"acc_norm_stderr\": 0.02567008063690919\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5,\n\ \ \"acc_stderr\": 0.04472135954999579,\n \"acc_norm\": 0.5,\n \ \ \"acc_norm_stderr\": 0.04472135954999579\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \ \ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\ : 0.8161290322580645,\n \"acc_stderr\": 0.022037217340267833,\n \"\ acc_norm\": 0.8161290322580645,\n \"acc_norm_stderr\": 0.022037217340267833\n\ \ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\ : 0.5172413793103449,\n \"acc_stderr\": 0.03515895551165698,\n \"\ acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.03515895551165698\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\"\ : 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.8363636363636363,\n \"acc_stderr\": 0.02888787239548795,\n\ \ \"acc_norm\": 0.8363636363636363,\n \"acc_norm_stderr\": 0.02888787239548795\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.8585858585858586,\n \"acc_stderr\": 0.02482590979334332,\n \"\ acc_norm\": 0.8585858585858586,\n \"acc_norm_stderr\": 0.02482590979334332\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.927461139896373,\n \"acc_stderr\": 0.018718998520678178,\n\ \ \"acc_norm\": 0.927461139896373,\n \"acc_norm_stderr\": 0.018718998520678178\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.6923076923076923,\n \"acc_stderr\": 0.023400928918310495,\n\ \ \"acc_norm\": 0.6923076923076923,\n \"acc_norm_stderr\": 0.023400928918310495\n\ \ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ 
acc\": 0.3,\n \"acc_stderr\": 0.027940457136228412,\n \"acc_norm\"\ : 0.3,\n \"acc_norm_stderr\": 0.027940457136228412\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\"\ : {\n \"acc\": 0.7521008403361344,\n \"acc_stderr\": 0.028047967224176892,\n\ \ \"acc_norm\": 0.7521008403361344,\n \"acc_norm_stderr\": 0.028047967224176892\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.45695364238410596,\n \"acc_stderr\": 0.04067325174247443,\n \"\ acc_norm\": 0.45695364238410596,\n \"acc_norm_stderr\": 0.04067325174247443\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.8844036697247707,\n \"acc_stderr\": 0.01370874953417264,\n \"\ acc_norm\": 0.8844036697247707,\n \"acc_norm_stderr\": 0.01370874953417264\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.5277777777777778,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\ : 0.5277777777777778,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\ \ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9166666666666666,\n\ \ \"acc_stderr\": 0.019398452135813905,\n \"acc_norm\": 0.9166666666666666,\n\ \ \"acc_norm_stderr\": 0.019398452135813905\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\ : {\n \"acc\": 0.8860759493670886,\n \"acc_stderr\": 0.020681745135884562,\n\ \ \"acc_norm\": 0.8860759493670886,\n \"acc_norm_stderr\": 0.020681745135884562\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7757847533632287,\n\ \ \"acc_stderr\": 0.027991534258519517,\n \"acc_norm\": 0.7757847533632287,\n\ \ \"acc_norm_stderr\": 0.027991534258519517\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.03498149385462469,\n\ \ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.03498149385462469\n\ \ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.8347107438016529,\n \"acc_stderr\": 0.03390780612972776,\n \"\ acc_norm\": 
0.8347107438016529,\n \"acc_norm_stderr\": 0.03390780612972776\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n\ \ \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n\ \ \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.03192193448934724,\n\ \ \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.03192193448934724\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5803571428571429,\n\ \ \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.5803571428571429,\n\ \ \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.8349514563106796,\n \"acc_stderr\": 0.03675668832233188,\n\ \ \"acc_norm\": 0.8349514563106796,\n \"acc_norm_stderr\": 0.03675668832233188\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8974358974358975,\n\ \ \"acc_stderr\": 0.019875655027867468,\n \"acc_norm\": 0.8974358974358975,\n\ \ \"acc_norm_stderr\": 0.019875655027867468\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \ \ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8607918263090677,\n\ \ \"acc_stderr\": 0.012378786101885147,\n \"acc_norm\": 0.8607918263090677,\n\ \ \"acc_norm_stderr\": 0.012378786101885147\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.02378620325550829,\n\ \ \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.02378620325550829\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.6424581005586593,\n\ \ \"acc_stderr\": 0.01602939447489489,\n \"acc_norm\": 0.6424581005586593,\n\ \ \"acc_norm_stderr\": 0.01602939447489489\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 
0.7222222222222222,\n \"acc_stderr\": 0.025646863097137897,\n\ \ \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.025646863097137897\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7491961414790996,\n\ \ \"acc_stderr\": 0.024619771956697168,\n \"acc_norm\": 0.7491961414790996,\n\ \ \"acc_norm_stderr\": 0.024619771956697168\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.8024691358024691,\n \"acc_stderr\": 0.022152889927898965,\n\ \ \"acc_norm\": 0.8024691358024691,\n \"acc_norm_stderr\": 0.022152889927898965\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.574468085106383,\n \"acc_stderr\": 0.029494827600144366,\n \ \ \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.029494827600144366\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5671447196870926,\n\ \ \"acc_stderr\": 0.012654565234622862,\n \"acc_norm\": 0.5671447196870926,\n\ \ \"acc_norm_stderr\": 0.012654565234622862\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.028332959514031218,\n\ \ \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.028332959514031218\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.7352941176470589,\n \"acc_stderr\": 0.01784808957491323,\n \ \ \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.01784808957491323\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n\ \ \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n\ \ \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.02812342933514278,\n\ \ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.02812342933514278\n\ \ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8756218905472637,\n\ \ \"acc_stderr\": 0.023335401790166327,\n \"acc_norm\": 
0.8756218905472637,\n\ \ \"acc_norm_stderr\": 0.023335401790166327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \ \ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n\ \ \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n\ \ \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.847953216374269,\n \"acc_stderr\": 0.02753912288906145,\n\ \ \"acc_norm\": 0.847953216374269,\n \"acc_norm_stderr\": 0.02753912288906145\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.43818849449204406,\n\ \ \"mc1_stderr\": 0.017369236164404445,\n \"mc2\": 0.624345044166297,\n\ \ \"mc2_stderr\": 0.014991862964877591\n }\n}\n```" repo_url: https://huggingface.co/yeontaek/llama-2-70B-ensemble-v6 leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_09_04T18_44_04.025476 path: - '**/details_harness|arc:challenge|25_2023-09-04T18:44:04.025476.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-09-04T18:44:04.025476.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_09_04T18_44_04.025476 path: - '**/details_harness|hellaswag|10_2023-09-04T18:44:04.025476.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-09-04T18:44:04.025476.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_09_04T18_44_04.025476 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-09-04T18:44:04.025476.parquet' - 
'**/details_harness|hendrycksTest-business_ethics|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-04T18:44:04.025476.parquet' - 
'**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-management|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-09-04T18:44:04.025476.parquet' - 
'**/details_harness|hendrycksTest-philosophy|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-09-04T18:44:04.025476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-04T18:44:04.025476.parquet' - 
'**/details_harness|hendrycksTest-college_medicine|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-04T18:44:04.025476.parquet' - 
'**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-management|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-04T18:44:04.025476.parquet' - 
'**/details_harness|hendrycksTest-public_relations|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-09-04T18:44:04.025476.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-09-04T18:44:04.025476.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_09_04T18_44_04.025476 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-04T18:44:04.025476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-04T18:44:04.025476.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_09_04T18_44_04.025476 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-09-04T18:44:04.025476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-09-04T18:44:04.025476.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_09_04T18_44_04.025476 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-09-04T18:44:04.025476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-09-04T18:44:04.025476.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_09_04T18_44_04.025476 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-09-04T18:44:04.025476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-09-04T18:44:04.025476.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_09_04T18_44_04.025476 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-04T18:44:04.025476.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-04T18:44:04.025476.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_09_04T18_44_04.025476 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-09-04T18:44:04.025476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-09-04T18:44:04.025476.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_09_04T18_44_04.025476 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-04T18:44:04.025476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-04T18:44:04.025476.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_09_04T18_44_04.025476 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-04T18:44:04.025476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-04T18:44:04.025476.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_09_04T18_44_04.025476 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-04T18:44:04.025476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-04T18:44:04.025476.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_09_04T18_44_04.025476 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-09-04T18:44:04.025476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-09-04T18:44:04.025476.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_09_04T18_44_04.025476 path: - '**/details_harness|hendrycksTest-college_physics|5_2023-09-04T18:44:04.025476.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-09-04T18:44:04.025476.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_09_04T18_44_04.025476 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-09-04T18:44:04.025476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-09-04T18:44:04.025476.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_09_04T18_44_04.025476 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-04T18:44:04.025476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-04T18:44:04.025476.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_09_04T18_44_04.025476 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-09-04T18:44:04.025476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-09-04T18:44:04.025476.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_09_04T18_44_04.025476 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-04T18:44:04.025476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-04T18:44:04.025476.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_09_04T18_44_04.025476 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-04T18:44:04.025476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-04T18:44:04.025476.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_09_04T18_44_04.025476 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-09-04T18:44:04.025476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-09-04T18:44:04.025476.parquet' - 
config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_09_04T18_44_04.025476 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-09-04T18:44:04.025476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-09-04T18:44:04.025476.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_09_04T18_44_04.025476 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-04T18:44:04.025476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-04T18:44:04.025476.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_09_04T18_44_04.025476 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-04T18:44:04.025476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-04T18:44:04.025476.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_09_04T18_44_04.025476 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-04T18:44:04.025476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-04T18:44:04.025476.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_09_04T18_44_04.025476 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-04T18:44:04.025476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-04T18:44:04.025476.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_09_04T18_44_04.025476 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-04T18:44:04.025476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-04T18:44:04.025476.parquet' - config_name: 
harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_09_04T18_44_04.025476 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-04T18:44:04.025476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-04T18:44:04.025476.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_09_04T18_44_04.025476 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-04T18:44:04.025476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-04T18:44:04.025476.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_09_04T18_44_04.025476 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-04T18:44:04.025476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-04T18:44:04.025476.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_09_04T18_44_04.025476 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-04T18:44:04.025476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-04T18:44:04.025476.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_09_04T18_44_04.025476 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-04T18:44:04.025476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-04T18:44:04.025476.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_09_04T18_44_04.025476 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-04T18:44:04.025476.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-04T18:44:04.025476.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_09_04T18_44_04.025476 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-04T18:44:04.025476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-04T18:44:04.025476.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_09_04T18_44_04.025476 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-04T18:44:04.025476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-04T18:44:04.025476.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_09_04T18_44_04.025476 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-04T18:44:04.025476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-04T18:44:04.025476.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_09_04T18_44_04.025476 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-09-04T18:44:04.025476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-09-04T18:44:04.025476.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_09_04T18_44_04.025476 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-04T18:44:04.025476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-04T18:44:04.025476.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_09_04T18_44_04.025476 path: - '**/details_harness|hendrycksTest-international_law|5_2023-09-04T18:44:04.025476.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-09-04T18:44:04.025476.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_09_04T18_44_04.025476 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-04T18:44:04.025476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-04T18:44:04.025476.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_09_04T18_44_04.025476 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-04T18:44:04.025476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-04T18:44:04.025476.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_09_04T18_44_04.025476 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-09-04T18:44:04.025476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-09-04T18:44:04.025476.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_09_04T18_44_04.025476 path: - '**/details_harness|hendrycksTest-management|5_2023-09-04T18:44:04.025476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-09-04T18:44:04.025476.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_09_04T18_44_04.025476 path: - '**/details_harness|hendrycksTest-marketing|5_2023-09-04T18:44:04.025476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-09-04T18:44:04.025476.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_09_04T18_44_04.025476 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-04T18:44:04.025476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-04T18:44:04.025476.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - 
split: 2023_09_04T18_44_04.025476 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-04T18:44:04.025476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-04T18:44:04.025476.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_09_04T18_44_04.025476 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-04T18:44:04.025476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-04T18:44:04.025476.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_09_04T18_44_04.025476 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-04T18:44:04.025476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-04T18:44:04.025476.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_09_04T18_44_04.025476 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-09-04T18:44:04.025476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-09-04T18:44:04.025476.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_09_04T18_44_04.025476 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-09-04T18:44:04.025476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-09-04T18:44:04.025476.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_09_04T18_44_04.025476 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-09-04T18:44:04.025476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-09-04T18:44:04.025476.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2023_09_04T18_44_04.025476 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-04T18:44:04.025476.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-professional_accounting|5_2023-09-04T18:44:04.025476.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_09_04T18_44_04.025476 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-09-04T18:44:04.025476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-09-04T18:44:04.025476.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_09_04T18_44_04.025476 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-04T18:44:04.025476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-04T18:44:04.025476.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_09_04T18_44_04.025476 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-04T18:44:04.025476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-04T18:44:04.025476.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_09_04T18_44_04.025476 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-09-04T18:44:04.025476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-09-04T18:44:04.025476.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_09_04T18_44_04.025476 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-09-04T18:44:04.025476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-09-04T18:44:04.025476.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_09_04T18_44_04.025476 path: - '**/details_harness|hendrycksTest-sociology|5_2023-09-04T18:44:04.025476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-09-04T18:44:04.025476.parquet' - 
config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_09_04T18_44_04.025476 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-04T18:44:04.025476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-04T18:44:04.025476.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_09_04T18_44_04.025476 path: - '**/details_harness|hendrycksTest-virology|5_2023-09-04T18:44:04.025476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-09-04T18:44:04.025476.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_09_04T18_44_04.025476 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-09-04T18:44:04.025476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-09-04T18:44:04.025476.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_09_04T18_44_04.025476 path: - '**/details_harness|truthfulqa:mc|0_2023-09-04T18:44:04.025476.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-09-04T18:44:04.025476.parquet' - config_name: results data_files: - split: 2023_09_04T18_44_04.025476 path: - results_2023-09-04T18:44:04.025476.parquet - split: latest path: - results_2023-09-04T18:44:04.025476.parquet --- # Dataset Card for Evaluation run of yeontaek/llama-2-70B-ensemble-v6 ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/yeontaek/llama-2-70B-ensemble-v6 - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [yeontaek/llama-2-70B-ensemble-v6](https://huggingface.co/yeontaek/llama-2-70B-ensemble-v6) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). 
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset

data = load_dataset(
    "open-llm-leaderboard/details_yeontaek__llama-2-70B-ensemble-v6",
    "harness_truthfulqa_mc_0",
    split="train",
)
```

## Latest results

These are the [latest results from run 2023-09-04T18:44:04.025476](https://huggingface.co/datasets/open-llm-leaderboard/details_yeontaek__llama-2-70B-ensemble-v6/blob/main/results_2023-09-04T18%3A44%3A04.025476.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6807030718501101, "acc_stderr": 0.03172804575089014, "acc_norm": 0.6844122287414298, "acc_norm_stderr": 0.03169926697255189, "mc1": 0.43818849449204406, "mc1_stderr": 0.017369236164404445, "mc2": 0.624345044166297, "mc2_stderr": 0.014991862964877591 }, "harness|arc:challenge|25": { "acc": 0.6774744027303754, "acc_stderr": 0.013659980894277373, "acc_norm": 0.7098976109215017, "acc_norm_stderr": 0.013261573677520767 }, "harness|hellaswag|10": { "acc": 0.6856203943437562, "acc_stderr": 0.004633194825793846, "acc_norm": 0.8720374427404899, "acc_norm_stderr": 0.003333654120593691 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.33, "acc_stderr": 0.047258156262526045, "acc_norm": 0.33, "acc_norm_stderr": 0.047258156262526045 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6296296296296297, "acc_stderr": 0.04171654161354543, "acc_norm": 0.6296296296296297, "acc_norm_stderr": 0.04171654161354543 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.7763157894736842, "acc_stderr": 0.03391160934343604, "acc_norm": 0.7763157894736842, "acc_norm_stderr": 0.03391160934343604 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.720754716981132, "acc_stderr": 0.027611163402399715, "acc_norm": 0.720754716981132, "acc_norm_stderr": 0.027611163402399715 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.8055555555555556, "acc_stderr": 0.03309615177059006, "acc_norm": 0.8055555555555556, "acc_norm_stderr": 0.03309615177059006 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.44, "acc_stderr": 0.04988876515698589, "acc_norm": 0.44, "acc_norm_stderr": 0.04988876515698589 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.52, "acc_stderr": 0.050211673156867795, "acc_norm": 0.52, "acc_norm_stderr": 
0.050211673156867795 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.39, "acc_stderr": 0.04902071300001974, "acc_norm": 0.39, "acc_norm_stderr": 0.04902071300001974 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6589595375722543, "acc_stderr": 0.03614665424180826, "acc_norm": 0.6589595375722543, "acc_norm_stderr": 0.03614665424180826 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.35294117647058826, "acc_stderr": 0.04755129616062947, "acc_norm": 0.35294117647058826, "acc_norm_stderr": 0.04755129616062947 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.75, "acc_stderr": 0.04351941398892446, "acc_norm": 0.75, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.6382978723404256, "acc_stderr": 0.03141082197596241, "acc_norm": 0.6382978723404256, "acc_norm_stderr": 0.03141082197596241 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.41228070175438597, "acc_stderr": 0.04630653203366595, "acc_norm": 0.41228070175438597, "acc_norm_stderr": 0.04630653203366595 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5517241379310345, "acc_stderr": 0.04144311810878152, "acc_norm": 0.5517241379310345, "acc_norm_stderr": 0.04144311810878152 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.4603174603174603, "acc_stderr": 0.02567008063690919, "acc_norm": 0.4603174603174603, "acc_norm_stderr": 0.02567008063690919 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.5, "acc_stderr": 0.04472135954999579, "acc_norm": 0.5, "acc_norm_stderr": 0.04472135954999579 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.41, "acc_stderr": 0.049431107042371025, "acc_norm": 0.41, "acc_norm_stderr": 0.049431107042371025 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.8161290322580645, "acc_stderr": 0.022037217340267833, "acc_norm": 0.8161290322580645, "acc_norm_stderr": 0.022037217340267833 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 
0.5172413793103449, "acc_stderr": 0.03515895551165698, "acc_norm": 0.5172413793103449, "acc_norm_stderr": 0.03515895551165698 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.76, "acc_stderr": 0.042923469599092816, "acc_norm": 0.76, "acc_norm_stderr": 0.042923469599092816 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.8363636363636363, "acc_stderr": 0.02888787239548795, "acc_norm": 0.8363636363636363, "acc_norm_stderr": 0.02888787239548795 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.8585858585858586, "acc_stderr": 0.02482590979334332, "acc_norm": 0.8585858585858586, "acc_norm_stderr": 0.02482590979334332 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.927461139896373, "acc_stderr": 0.018718998520678178, "acc_norm": 0.927461139896373, "acc_norm_stderr": 0.018718998520678178 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6923076923076923, "acc_stderr": 0.023400928918310495, "acc_norm": 0.6923076923076923, "acc_norm_stderr": 0.023400928918310495 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3, "acc_stderr": 0.027940457136228412, "acc_norm": 0.3, "acc_norm_stderr": 0.027940457136228412 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.7521008403361344, "acc_stderr": 0.028047967224176892, "acc_norm": 0.7521008403361344, "acc_norm_stderr": 0.028047967224176892 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.45695364238410596, "acc_stderr": 0.04067325174247443, "acc_norm": 0.45695364238410596, "acc_norm_stderr": 0.04067325174247443 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8844036697247707, "acc_stderr": 0.01370874953417264, "acc_norm": 0.8844036697247707, "acc_norm_stderr": 0.01370874953417264 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5277777777777778, "acc_stderr": 0.0340470532865388, "acc_norm": 0.5277777777777778, "acc_norm_stderr": 0.0340470532865388 }, 
"harness|hendrycksTest-high_school_us_history|5": { "acc": 0.9166666666666666, "acc_stderr": 0.019398452135813905, "acc_norm": 0.9166666666666666, "acc_norm_stderr": 0.019398452135813905 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8860759493670886, "acc_stderr": 0.020681745135884562, "acc_norm": 0.8860759493670886, "acc_norm_stderr": 0.020681745135884562 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.7757847533632287, "acc_stderr": 0.027991534258519517, "acc_norm": 0.7757847533632287, "acc_norm_stderr": 0.027991534258519517 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.8015267175572519, "acc_stderr": 0.03498149385462469, "acc_norm": 0.8015267175572519, "acc_norm_stderr": 0.03498149385462469 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8347107438016529, "acc_stderr": 0.03390780612972776, "acc_norm": 0.8347107438016529, "acc_norm_stderr": 0.03390780612972776 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.8055555555555556, "acc_stderr": 0.038260763248848646, "acc_norm": 0.8055555555555556, "acc_norm_stderr": 0.038260763248848646 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7914110429447853, "acc_stderr": 0.03192193448934724, "acc_norm": 0.7914110429447853, "acc_norm_stderr": 0.03192193448934724 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.5803571428571429, "acc_stderr": 0.04684099321077106, "acc_norm": 0.5803571428571429, "acc_norm_stderr": 0.04684099321077106 }, "harness|hendrycksTest-management|5": { "acc": 0.8349514563106796, "acc_stderr": 0.03675668832233188, "acc_norm": 0.8349514563106796, "acc_norm_stderr": 0.03675668832233188 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8974358974358975, "acc_stderr": 0.019875655027867468, "acc_norm": 0.8974358974358975, "acc_norm_stderr": 0.019875655027867468 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.71, "acc_stderr": 0.045604802157206845, "acc_norm": 0.71, "acc_norm_stderr": 0.045604802157206845 }, 
"harness|hendrycksTest-miscellaneous|5": { "acc": 0.8607918263090677, "acc_stderr": 0.012378786101885147, "acc_norm": 0.8607918263090677, "acc_norm_stderr": 0.012378786101885147 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7341040462427746, "acc_stderr": 0.02378620325550829, "acc_norm": 0.7341040462427746, "acc_norm_stderr": 0.02378620325550829 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.6424581005586593, "acc_stderr": 0.01602939447489489, "acc_norm": 0.6424581005586593, "acc_norm_stderr": 0.01602939447489489 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7222222222222222, "acc_stderr": 0.025646863097137897, "acc_norm": 0.7222222222222222, "acc_norm_stderr": 0.025646863097137897 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7491961414790996, "acc_stderr": 0.024619771956697168, "acc_norm": 0.7491961414790996, "acc_norm_stderr": 0.024619771956697168 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.8024691358024691, "acc_stderr": 0.022152889927898965, "acc_norm": 0.8024691358024691, "acc_norm_stderr": 0.022152889927898965 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.574468085106383, "acc_stderr": 0.029494827600144366, "acc_norm": 0.574468085106383, "acc_norm_stderr": 0.029494827600144366 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.5671447196870926, "acc_stderr": 0.012654565234622862, "acc_norm": 0.5671447196870926, "acc_norm_stderr": 0.012654565234622862 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6801470588235294, "acc_stderr": 0.028332959514031218, "acc_norm": 0.6801470588235294, "acc_norm_stderr": 0.028332959514031218 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.7352941176470589, "acc_stderr": 0.01784808957491323, "acc_norm": 0.7352941176470589, "acc_norm_stderr": 0.01784808957491323 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.7090909090909091, "acc_stderr": 0.04350271442923243, "acc_norm": 0.7090909090909091, "acc_norm_stderr": 
0.04350271442923243 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7387755102040816, "acc_stderr": 0.02812342933514278, "acc_norm": 0.7387755102040816, "acc_norm_stderr": 0.02812342933514278 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8756218905472637, "acc_stderr": 0.023335401790166327, "acc_norm": 0.8756218905472637, "acc_norm_stderr": 0.023335401790166327 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.83, "acc_stderr": 0.03775251680686371, "acc_norm": 0.83, "acc_norm_stderr": 0.03775251680686371 }, "harness|hendrycksTest-virology|5": { "acc": 0.5180722891566265, "acc_stderr": 0.03889951252827216, "acc_norm": 0.5180722891566265, "acc_norm_stderr": 0.03889951252827216 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.847953216374269, "acc_stderr": 0.02753912288906145, "acc_norm": 0.847953216374269, "acc_norm_stderr": 0.02753912288906145 }, "harness|truthfulqa:mc|0": { "mc1": 0.43818849449204406, "mc1_stderr": 0.017369236164404445, "mc2": 0.624345044166297, "mc2_stderr": 0.014991862964877591 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? 
[More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
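The aggregated "results" configuration described above is derived from the per-task scores. As a minimal, self-contained sketch of that roll-up (the three scores are copied from the "Latest results" JSON above; the unweighted mean is an assumed aggregation scheme, not necessarily the leaderboard's exact code):

```python
# Sketch: roll per-task accuracies up into a single aggregate score.
# The scores below are a small subset copied from the results JSON above;
# the unweighted mean is an assumption, not the leaderboard's exact
# implementation.
task_acc = {
    "hendrycksTest-abstract_algebra": 0.33,
    "hendrycksTest-anatomy": 0.6296296296296297,
    "hendrycksTest-astronomy": 0.7763157894736842,
}

mean_acc = sum(task_acc.values()) / len(task_acc)
print(f"mean acc over {len(task_acc)} tasks: {mean_acc:.4f}")
```

The full leaderboard aggregate averages all 57 `hendrycksTest` subjects the same way, one key per configuration listed in the YAML front matter above.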
suryajg/suvin
2023-09-04T18:52:34.000Z
[ "license:openrail", "region:us" ]
suryajg
null
null
null
0
0
--- license: openrail ---
Lilsunx/suvinlee
2023-09-04T18:59:13.000Z
[ "license:openrail", "region:us" ]
Lilsunx
null
null
null
0
0
--- license: openrail ---
MSD-Team/LLMTest
2023-09-06T16:13:49.000Z
[ "region:us" ]
MSD-Team
null
null
null
0
0
jpark2111/uber
2023-09-04T19:48:30.000Z
[ "region:us" ]
jpark2111
null
null
null
0
0
Entry not found
ramsel/dataviz-sample
2023-09-04T19:24:15.000Z
[ "region:us" ]
ramsel
null
null
null
0
0
--- configs: - config_name: default data_files: - split: train path: data/train-* dataset_info: features: - name: text dtype: string splits: - name: train num_bytes: 9022 num_examples: 11 download_size: 8204 dataset_size: 9022 --- # Dataset Card for "dataviz-sample" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
tharun-6743/criket-546
2023-09-04T19:30:13.000Z
[ "license:openrail", "region:us" ]
tharun-6743
null
null
null
0
0
--- license: openrail ---
tharun-6743/tharun
2023-09-04T19:43:44.000Z
[ "license:openrail", "region:us" ]
tharun-6743
null
null
null
0
0
--- license: openrail ---
volvoDon/petrology-sections
2023-09-04T19:36:45.000Z
[ "license:afl-3.0", "region:us" ]
volvoDon
null
null
null
0
0
--- license: afl-3.0 configs: - config_name: default data_files: - split: train path: data/train-* - split: test path: data/test-* dataset_info: features: - name: image dtype: image - name: label dtype: class_label: names: '0': garnet '1': kayanite '2': olivine '3': sillimanite '4': staurolite '5': titanite '6': zircon splits: - name: train num_bytes: 13556462.0 num_examples: 60 - name: test num_bytes: 2172586.0 num_examples: 10 download_size: 7727490 dataset_size: 15729048.0 ---
sngsfydy/aptos_gaussian_filtered
2023-09-04T19:51:14.000Z
[ "region:us" ]
sngsfydy
null
null
null
0
0
--- configs: - config_name: default data_files: - split: train path: data/train-* dataset_info: features: - name: image dtype: image - name: label dtype: class_label: names: '0': Mild '1': Moderate '2': No_DR '3': Proliferate_DR '4': Severe splits: - name: train num_bytes: 379036368.878 num_examples: 3662 download_size: 364168700 dataset_size: 379036368.878 --- # Dataset Card for "aptos_gaussian_filtered" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
maysamalfiza/diversity_statement
2023-09-04T20:14:18.000Z
[ "region:us" ]
maysamalfiza
null
null
null
0
0
diversity statement annotated from online job postings
irenecat/slenteng
2023-09-05T11:00:40.000Z
[ "region:us" ]
irenecat
null
null
null
0
0
Entry not found
open-llm-leaderboard/details_yeontaek__llama-2-70B-ensemble-v8
2023-09-04T20:28:33.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
null
0
0
--- pretty_name: Evaluation run of yeontaek/llama-2-70B-ensemble-v8 dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [yeontaek/llama-2-70B-ensemble-v8](https://huggingface.co/yeontaek/llama-2-70B-ensemble-v8)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 61 configuration, each one coresponding to one of the\ \ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" store all the aggregated results of the\ \ run (and is used to compute and display the agregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yeontaek__llama-2-70B-ensemble-v8\"\ ,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\ \nThese are the [latest results from run 2023-09-04T20:27:12.407104](https://huggingface.co/datasets/open-llm-leaderboard/details_yeontaek__llama-2-70B-ensemble-v8/blob/main/results_2023-09-04T20%3A27%3A12.407104.json)(note\ \ that their might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6363532267919642,\n\ \ \"acc_stderr\": 0.03285197203583459,\n \"acc_norm\": 0.6397352881146252,\n\ \ \"acc_norm_stderr\": 0.03283029655087548,\n \"mc1\": 0.45165238678090575,\n\ \ \"mc1_stderr\": 0.017421480300277643,\n \"mc2\": 0.6211306316728467,\n\ \ \"mc2_stderr\": 0.01529356194952766\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.6561433447098977,\n \"acc_stderr\": 0.013880644570156215,\n\ \ \"acc_norm\": 0.6723549488054608,\n \"acc_norm_stderr\": 0.013715847940719339\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6623182632941645,\n\ \ \"acc_stderr\": 0.004719529099913132,\n \"acc_norm\": 0.8456482772356104,\n\ \ \"acc_norm_stderr\": 0.003605472116762285\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \ \ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5333333333333333,\n\ \ \"acc_stderr\": 0.043097329010363554,\n \"acc_norm\": 0.5333333333333333,\n\ \ \"acc_norm_stderr\": 0.043097329010363554\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n\ \ \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.68,\n\ \ \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.68,\n \ \ \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n\ \ \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n\ \ \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \ \ \"acc_norm_stderr\": 0.03621034121889507\n },\n 
\"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \ \ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n\ \ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \ \ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6069364161849711,\n\ \ \"acc_stderr\": 0.03724249595817731,\n \"acc_norm\": 0.6069364161849711,\n\ \ \"acc_norm_stderr\": 0.03724249595817731\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.04533838195929777,\n\ \ \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.04533838195929777\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.72,\n\ \ \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.6297872340425532,\n \"acc_stderr\": 0.03156564682236784,\n\ \ \"acc_norm\": 0.6297872340425532,\n \"acc_norm_stderr\": 0.03156564682236784\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3684210526315789,\n\ \ \"acc_stderr\": 0.04537815354939391,\n \"acc_norm\": 0.3684210526315789,\n\ \ \"acc_norm_stderr\": 0.04537815354939391\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n\ \ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.3941798941798942,\n \"acc_stderr\": 0.025167982333894143,\n \"\ acc_norm\": 0.3941798941798942,\n 
\"acc_norm_stderr\": 0.025167982333894143\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n\ \ \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.42857142857142855,\n\ \ \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \ \ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7741935483870968,\n\ \ \"acc_stderr\": 0.023785577884181015,\n \"acc_norm\": 0.7741935483870968,\n\ \ \"acc_norm_stderr\": 0.023785577884181015\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.45320197044334976,\n \"acc_stderr\": 0.03502544650845872,\n\ \ \"acc_norm\": 0.45320197044334976,\n \"acc_norm_stderr\": 0.03502544650845872\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\"\ : 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.031922715695483,\n\ \ \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.031922715695483\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.8535353535353535,\n \"acc_stderr\": 0.025190921114603925,\n \"\ acc_norm\": 0.8535353535353535,\n \"acc_norm_stderr\": 0.025190921114603925\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328972,\n\ \ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328972\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.6641025641025641,\n \"acc_stderr\": 0.023946724741563973,\n\ \ \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563973\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.2777777777777778,\n \"acc_stderr\": 0.027309140588230165,\n \ \ \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.027309140588230165\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.030066761582977927,\n\ \ \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.030066761582977927\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.40397350993377484,\n \"acc_stderr\": 0.0400648568536534,\n \"\ acc_norm\": 0.40397350993377484,\n \"acc_norm_stderr\": 0.0400648568536534\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.8440366972477065,\n \"acc_stderr\": 0.01555580271359016,\n \"\ acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.01555580271359016\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.5277777777777778,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\ : 0.5277777777777778,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\ \ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8431372549019608,\n\ \ \"acc_stderr\": 0.02552472232455335,\n \"acc_norm\": 0.8431372549019608,\n\ \ \"acc_norm_stderr\": 0.02552472232455335\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\ : {\n \"acc\": 0.8396624472573839,\n \"acc_stderr\": 0.02388438092596567,\n\ \ \"acc_norm\": 0.8396624472573839,\n \"acc_norm_stderr\": 0.02388438092596567\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7443946188340808,\n\ \ \"acc_stderr\": 0.029275891003969923,\n \"acc_norm\": 0.7443946188340808,\n\ \ \"acc_norm_stderr\": 0.029275891003969923\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306086,\n\ \ \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306086\n\ \ },\n \"harness|hendrycksTest-international_law|5\": {\n 
\"acc\":\ \ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\ acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\ \ \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.7777777777777778,\n\ \ \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\ \ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5357142857142857,\n\ \ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.5357142857142857,\n\ \ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n\ \ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n\ \ \"acc_stderr\": 0.023086635086841403,\n \"acc_norm\": 0.8547008547008547,\n\ \ \"acc_norm_stderr\": 0.023086635086841403\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \ \ \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8263090676883781,\n\ \ \"acc_stderr\": 0.013547415658662257,\n \"acc_norm\": 0.8263090676883781,\n\ \ \"acc_norm_stderr\": 0.013547415658662257\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.023786203255508297,\n\ \ \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.023786203255508297\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3687150837988827,\n\ \ \"acc_stderr\": 0.016135759015030122,\n \"acc_norm\": 0.3687150837988827,\n\ \ 
\"acc_norm_stderr\": 0.016135759015030122\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.6830065359477124,\n \"acc_stderr\": 0.026643278474508755,\n\ \ \"acc_norm\": 0.6830065359477124,\n \"acc_norm_stderr\": 0.026643278474508755\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7266881028938906,\n\ \ \"acc_stderr\": 0.025311765975426122,\n \"acc_norm\": 0.7266881028938906,\n\ \ \"acc_norm_stderr\": 0.025311765975426122\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.7654320987654321,\n \"acc_stderr\": 0.02357688174400571,\n\ \ \"acc_norm\": 0.7654320987654321,\n \"acc_norm_stderr\": 0.02357688174400571\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \ \ \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5208604954367666,\n\ \ \"acc_stderr\": 0.012759117066518008,\n \"acc_norm\": 0.5208604954367666,\n\ \ \"acc_norm_stderr\": 0.012759117066518008\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.5955882352941176,\n \"acc_stderr\": 0.029812630701569743,\n\ \ \"acc_norm\": 0.5955882352941176,\n \"acc_norm_stderr\": 0.029812630701569743\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.6797385620915033,\n \"acc_stderr\": 0.018875682938069443,\n \ \ \"acc_norm\": 0.6797385620915033,\n \"acc_norm_stderr\": 0.018875682938069443\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\ \ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\ \ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.6693877551020408,\n \"acc_stderr\": 0.030116426296540606,\n\ \ \"acc_norm\": 0.6693877551020408,\n \"acc_norm_stderr\": 0.030116426296540606\n\ \ },\n 
\"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7960199004975125,\n\ \ \"acc_stderr\": 0.02849317624532607,\n \"acc_norm\": 0.7960199004975125,\n\ \ \"acc_norm_stderr\": 0.02849317624532607\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \ \ \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5060240963855421,\n\ \ \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.5060240963855421,\n\ \ \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\ \ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.45165238678090575,\n\ \ \"mc1_stderr\": 0.017421480300277643,\n \"mc2\": 0.6211306316728467,\n\ \ \"mc2_stderr\": 0.01529356194952766\n }\n}\n```" repo_url: https://huggingface.co/yeontaek/llama-2-70B-ensemble-v8 leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_09_04T20_27_12.407104 path: - '**/details_harness|arc:challenge|25_2023-09-04T20:27:12.407104.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-09-04T20:27:12.407104.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_09_04T20_27_12.407104 path: - '**/details_harness|hellaswag|10_2023-09-04T20:27:12.407104.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-09-04T20:27:12.407104.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_09_04T20_27_12.407104 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-09-04T20:27:12.407104.parquet' - 
'**/details_harness|hendrycksTest-astronomy|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-04T20:27:12.407104.parquet' - 
'**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-management|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-04T20:27:12.407104.parquet' - 
'**/details_harness|hendrycksTest-nutrition|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-09-04T20:27:12.407104.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-04T20:27:12.407104.parquet' - 
'**/details_harness|hendrycksTest-college_mathematics|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-04T20:27:12.407104.parquet' - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-management|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-04T20:27:12.407104.parquet' - 
'**/details_harness|hendrycksTest-professional_psychology|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-09-04T20:27:12.407104.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-09-04T20:27:12.407104.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_09_04T20_27_12.407104 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-04T20:27:12.407104.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-04T20:27:12.407104.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_09_04T20_27_12.407104 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-09-04T20:27:12.407104.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-09-04T20:27:12.407104.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_09_04T20_27_12.407104 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-09-04T20:27:12.407104.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-09-04T20:27:12.407104.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_09_04T20_27_12.407104 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-09-04T20:27:12.407104.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-09-04T20:27:12.407104.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_09_04T20_27_12.407104 path: - 
'**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-04T20:27:12.407104.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-04T20:27:12.407104.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_09_04T20_27_12.407104 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-09-04T20:27:12.407104.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-09-04T20:27:12.407104.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_09_04T20_27_12.407104 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-04T20:27:12.407104.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-04T20:27:12.407104.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_09_04T20_27_12.407104 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-04T20:27:12.407104.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-04T20:27:12.407104.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_09_04T20_27_12.407104 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-04T20:27:12.407104.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-04T20:27:12.407104.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_09_04T20_27_12.407104 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-09-04T20:27:12.407104.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-09-04T20:27:12.407104.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_09_04T20_27_12.407104 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-09-04T20:27:12.407104.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-09-04T20:27:12.407104.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_09_04T20_27_12.407104 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-09-04T20:27:12.407104.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-09-04T20:27:12.407104.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_09_04T20_27_12.407104 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-04T20:27:12.407104.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-04T20:27:12.407104.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_09_04T20_27_12.407104 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-09-04T20:27:12.407104.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-09-04T20:27:12.407104.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_09_04T20_27_12.407104 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-04T20:27:12.407104.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-04T20:27:12.407104.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_09_04T20_27_12.407104 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-04T20:27:12.407104.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-04T20:27:12.407104.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_09_04T20_27_12.407104 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-09-04T20:27:12.407104.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-09-04T20:27:12.407104.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_09_04T20_27_12.407104 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-09-04T20:27:12.407104.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-09-04T20:27:12.407104.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_09_04T20_27_12.407104 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-04T20:27:12.407104.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-04T20:27:12.407104.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_09_04T20_27_12.407104 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-04T20:27:12.407104.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-04T20:27:12.407104.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_09_04T20_27_12.407104 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-04T20:27:12.407104.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-04T20:27:12.407104.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_09_04T20_27_12.407104 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-04T20:27:12.407104.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-04T20:27:12.407104.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_09_04T20_27_12.407104 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-04T20:27:12.407104.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-09-04T20:27:12.407104.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_09_04T20_27_12.407104 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-04T20:27:12.407104.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-04T20:27:12.407104.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_09_04T20_27_12.407104 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-04T20:27:12.407104.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-04T20:27:12.407104.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_09_04T20_27_12.407104 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-04T20:27:12.407104.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-04T20:27:12.407104.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_09_04T20_27_12.407104 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-04T20:27:12.407104.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-04T20:27:12.407104.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_09_04T20_27_12.407104 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-04T20:27:12.407104.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-04T20:27:12.407104.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_09_04T20_27_12.407104 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-04T20:27:12.407104.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-04T20:27:12.407104.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_09_04T20_27_12.407104 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-04T20:27:12.407104.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-04T20:27:12.407104.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_09_04T20_27_12.407104 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-04T20:27:12.407104.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-04T20:27:12.407104.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_09_04T20_27_12.407104 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-04T20:27:12.407104.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-04T20:27:12.407104.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_09_04T20_27_12.407104 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-09-04T20:27:12.407104.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-09-04T20:27:12.407104.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_09_04T20_27_12.407104 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-04T20:27:12.407104.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-04T20:27:12.407104.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_09_04T20_27_12.407104 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-09-04T20:27:12.407104.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-09-04T20:27:12.407104.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_09_04T20_27_12.407104 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-04T20:27:12.407104.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-04T20:27:12.407104.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_09_04T20_27_12.407104 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-04T20:27:12.407104.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-04T20:27:12.407104.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_09_04T20_27_12.407104 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-09-04T20:27:12.407104.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-09-04T20:27:12.407104.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_09_04T20_27_12.407104 path: - '**/details_harness|hendrycksTest-management|5_2023-09-04T20:27:12.407104.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-09-04T20:27:12.407104.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_09_04T20_27_12.407104 path: - '**/details_harness|hendrycksTest-marketing|5_2023-09-04T20:27:12.407104.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-09-04T20:27:12.407104.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_09_04T20_27_12.407104 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-04T20:27:12.407104.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-09-04T20:27:12.407104.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_09_04T20_27_12.407104 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-04T20:27:12.407104.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-04T20:27:12.407104.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_09_04T20_27_12.407104 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-04T20:27:12.407104.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-04T20:27:12.407104.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_09_04T20_27_12.407104 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-04T20:27:12.407104.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-04T20:27:12.407104.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_09_04T20_27_12.407104 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-09-04T20:27:12.407104.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-09-04T20:27:12.407104.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_09_04T20_27_12.407104 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-09-04T20:27:12.407104.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-09-04T20:27:12.407104.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_09_04T20_27_12.407104 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-09-04T20:27:12.407104.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-09-04T20:27:12.407104.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_09_04T20_27_12.407104 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-04T20:27:12.407104.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-04T20:27:12.407104.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_09_04T20_27_12.407104 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-09-04T20:27:12.407104.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-09-04T20:27:12.407104.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_09_04T20_27_12.407104 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-04T20:27:12.407104.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-04T20:27:12.407104.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_09_04T20_27_12.407104 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-04T20:27:12.407104.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-04T20:27:12.407104.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_09_04T20_27_12.407104 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-09-04T20:27:12.407104.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-09-04T20:27:12.407104.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_09_04T20_27_12.407104 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-09-04T20:27:12.407104.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-09-04T20:27:12.407104.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_09_04T20_27_12.407104 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-09-04T20:27:12.407104.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-09-04T20:27:12.407104.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_09_04T20_27_12.407104 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-04T20:27:12.407104.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-04T20:27:12.407104.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_09_04T20_27_12.407104 path: - '**/details_harness|hendrycksTest-virology|5_2023-09-04T20:27:12.407104.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-09-04T20:27:12.407104.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_09_04T20_27_12.407104 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-09-04T20:27:12.407104.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-09-04T20:27:12.407104.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_09_04T20_27_12.407104 path: - '**/details_harness|truthfulqa:mc|0_2023-09-04T20:27:12.407104.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-09-04T20:27:12.407104.parquet' - config_name: results data_files: - split: 2023_09_04T20_27_12.407104 path: - results_2023-09-04T20:27:12.407104.parquet - split: latest path: - results_2023-09-04T20:27:12.407104.parquet --- # Dataset Card for Evaluation run of yeontaek/llama-2-70B-ensemble-v8 ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/yeontaek/llama-2-70B-ensemble-v8 - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model 
[yeontaek/llama-2-70B-ensemble-v8](https://huggingface.co/yeontaek/llama-2-70B-ensemble-v8) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_yeontaek__llama-2-70B-ensemble-v8", "harness_truthfulqa_mc_0", split="train") ``` ## Latest results These are the [latest results from run 2023-09-04T20:27:12.407104](https://huggingface.co/datasets/open-llm-leaderboard/details_yeontaek__llama-2-70B-ensemble-v8/blob/main/results_2023-09-04T20%3A27%3A12.407104.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. 
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6363532267919642, "acc_stderr": 0.03285197203583459, "acc_norm": 0.6397352881146252, "acc_norm_stderr": 0.03283029655087548, "mc1": 0.45165238678090575, "mc1_stderr": 0.017421480300277643, "mc2": 0.6211306316728467, "mc2_stderr": 0.01529356194952766 }, "harness|arc:challenge|25": { "acc": 0.6561433447098977, "acc_stderr": 0.013880644570156215, "acc_norm": 0.6723549488054608, "acc_norm_stderr": 0.013715847940719339 }, "harness|hellaswag|10": { "acc": 0.6623182632941645, "acc_stderr": 0.004719529099913132, "acc_norm": 0.8456482772356104, "acc_norm_stderr": 0.003605472116762285 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.29, "acc_stderr": 0.045604802157206845, "acc_norm": 0.29, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.5333333333333333, "acc_stderr": 0.043097329010363554, "acc_norm": 0.5333333333333333, "acc_norm_stderr": 0.043097329010363554 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6842105263157895, "acc_stderr": 0.0378272898086547, "acc_norm": 0.6842105263157895, "acc_norm_stderr": 0.0378272898086547 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.68, "acc_stderr": 0.046882617226215034, "acc_norm": 0.68, "acc_norm_stderr": 0.046882617226215034 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7056603773584905, "acc_stderr": 0.02804918631569525, "acc_norm": 0.7056603773584905, "acc_norm_stderr": 0.02804918631569525 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.75, "acc_stderr": 0.03621034121889507, "acc_norm": 0.75, "acc_norm_stderr": 0.03621034121889507 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.43, "acc_stderr": 0.04975698519562428, "acc_norm": 0.43, "acc_norm_stderr": 0.04975698519562428 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.48, "acc_stderr": 0.050211673156867795, "acc_norm": 0.48, "acc_norm_stderr": 0.050211673156867795 }, 
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.38, "acc_stderr": 0.048783173121456316, "acc_norm": 0.38, "acc_norm_stderr": 0.048783173121456316 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6069364161849711, "acc_stderr": 0.03724249595817731, "acc_norm": 0.6069364161849711, "acc_norm_stderr": 0.03724249595817731 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.29411764705882354, "acc_stderr": 0.04533838195929777, "acc_norm": 0.29411764705882354, "acc_norm_stderr": 0.04533838195929777 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.72, "acc_stderr": 0.045126085985421276, "acc_norm": 0.72, "acc_norm_stderr": 0.045126085985421276 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.6297872340425532, "acc_stderr": 0.03156564682236784, "acc_norm": 0.6297872340425532, "acc_norm_stderr": 0.03156564682236784 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.3684210526315789, "acc_stderr": 0.04537815354939391, "acc_norm": 0.3684210526315789, "acc_norm_stderr": 0.04537815354939391 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5517241379310345, "acc_stderr": 0.04144311810878152, "acc_norm": 0.5517241379310345, "acc_norm_stderr": 0.04144311810878152 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.3941798941798942, "acc_stderr": 0.025167982333894143, "acc_norm": 0.3941798941798942, "acc_norm_stderr": 0.025167982333894143 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.42857142857142855, "acc_stderr": 0.04426266681379909, "acc_norm": 0.42857142857142855, "acc_norm_stderr": 0.04426266681379909 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.44, "acc_stderr": 0.04988876515698589, "acc_norm": 0.44, "acc_norm_stderr": 0.04988876515698589 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7741935483870968, "acc_stderr": 0.023785577884181015, "acc_norm": 0.7741935483870968, "acc_norm_stderr": 0.023785577884181015 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 
0.45320197044334976, "acc_stderr": 0.03502544650845872, "acc_norm": 0.45320197044334976, "acc_norm_stderr": 0.03502544650845872 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.6, "acc_stderr": 0.049236596391733084, "acc_norm": 0.6, "acc_norm_stderr": 0.049236596391733084 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7878787878787878, "acc_stderr": 0.031922715695483, "acc_norm": 0.7878787878787878, "acc_norm_stderr": 0.031922715695483 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.8535353535353535, "acc_stderr": 0.025190921114603925, "acc_norm": 0.8535353535353535, "acc_norm_stderr": 0.025190921114603925 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9067357512953368, "acc_stderr": 0.02098685459328972, "acc_norm": 0.9067357512953368, "acc_norm_stderr": 0.02098685459328972 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6641025641025641, "acc_stderr": 0.023946724741563973, "acc_norm": 0.6641025641025641, "acc_norm_stderr": 0.023946724741563973 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.2777777777777778, "acc_stderr": 0.027309140588230165, "acc_norm": 0.2777777777777778, "acc_norm_stderr": 0.027309140588230165 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6890756302521008, "acc_stderr": 0.030066761582977927, "acc_norm": 0.6890756302521008, "acc_norm_stderr": 0.030066761582977927 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.40397350993377484, "acc_stderr": 0.0400648568536534, "acc_norm": 0.40397350993377484, "acc_norm_stderr": 0.0400648568536534 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8440366972477065, "acc_stderr": 0.01555580271359016, "acc_norm": 0.8440366972477065, "acc_norm_stderr": 0.01555580271359016 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5277777777777778, "acc_stderr": 0.0340470532865388, "acc_norm": 0.5277777777777778, "acc_norm_stderr": 
0.0340470532865388 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8431372549019608, "acc_stderr": 0.02552472232455335, "acc_norm": 0.8431372549019608, "acc_norm_stderr": 0.02552472232455335 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8396624472573839, "acc_stderr": 0.02388438092596567, "acc_norm": 0.8396624472573839, "acc_norm_stderr": 0.02388438092596567 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.7443946188340808, "acc_stderr": 0.029275891003969923, "acc_norm": 0.7443946188340808, "acc_norm_stderr": 0.029275891003969923 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7480916030534351, "acc_stderr": 0.03807387116306086, "acc_norm": 0.7480916030534351, "acc_norm_stderr": 0.03807387116306086 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7851239669421488, "acc_stderr": 0.037494924487096966, "acc_norm": 0.7851239669421488, "acc_norm_stderr": 0.037494924487096966 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7777777777777778, "acc_stderr": 0.040191074725573483, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.040191074725573483 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7730061349693251, "acc_stderr": 0.03291099578615769, "acc_norm": 0.7730061349693251, "acc_norm_stderr": 0.03291099578615769 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.5357142857142857, "acc_stderr": 0.04733667890053756, "acc_norm": 0.5357142857142857, "acc_norm_stderr": 0.04733667890053756 }, "harness|hendrycksTest-management|5": { "acc": 0.7864077669902912, "acc_stderr": 0.040580420156460344, "acc_norm": 0.7864077669902912, "acc_norm_stderr": 0.040580420156460344 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8547008547008547, "acc_stderr": 0.023086635086841403, "acc_norm": 0.8547008547008547, "acc_norm_stderr": 0.023086635086841403 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.65, "acc_stderr": 0.047937248544110196, "acc_norm": 0.65, "acc_norm_stderr": 0.047937248544110196 
}, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8263090676883781, "acc_stderr": 0.013547415658662257, "acc_norm": 0.8263090676883781, "acc_norm_stderr": 0.013547415658662257 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7341040462427746, "acc_stderr": 0.023786203255508297, "acc_norm": 0.7341040462427746, "acc_norm_stderr": 0.023786203255508297 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.3687150837988827, "acc_stderr": 0.016135759015030122, "acc_norm": 0.3687150837988827, "acc_norm_stderr": 0.016135759015030122 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6830065359477124, "acc_stderr": 0.026643278474508755, "acc_norm": 0.6830065359477124, "acc_norm_stderr": 0.026643278474508755 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7266881028938906, "acc_stderr": 0.025311765975426122, "acc_norm": 0.7266881028938906, "acc_norm_stderr": 0.025311765975426122 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7654320987654321, "acc_stderr": 0.02357688174400571, "acc_norm": 0.7654320987654321, "acc_norm_stderr": 0.02357688174400571 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.49645390070921985, "acc_stderr": 0.02982674915328092, "acc_norm": 0.49645390070921985, "acc_norm_stderr": 0.02982674915328092 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.5208604954367666, "acc_stderr": 0.012759117066518008, "acc_norm": 0.5208604954367666, "acc_norm_stderr": 0.012759117066518008 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.5955882352941176, "acc_stderr": 0.029812630701569743, "acc_norm": 0.5955882352941176, "acc_norm_stderr": 0.029812630701569743 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6797385620915033, "acc_stderr": 0.018875682938069443, "acc_norm": 0.6797385620915033, "acc_norm_stderr": 0.018875682938069443 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6545454545454545, "acc_stderr": 0.04554619617541054, "acc_norm": 0.6545454545454545, "acc_norm_stderr": 
0.04554619617541054 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.6693877551020408, "acc_stderr": 0.030116426296540606, "acc_norm": 0.6693877551020408, "acc_norm_stderr": 0.030116426296540606 }, "harness|hendrycksTest-sociology|5": { "acc": 0.7960199004975125, "acc_stderr": 0.02849317624532607, "acc_norm": 0.7960199004975125, "acc_norm_stderr": 0.02849317624532607 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.88, "acc_stderr": 0.03265986323710906, "acc_norm": 0.88, "acc_norm_stderr": 0.03265986323710906 }, "harness|hendrycksTest-virology|5": { "acc": 0.5060240963855421, "acc_stderr": 0.03892212195333045, "acc_norm": 0.5060240963855421, "acc_norm_stderr": 0.03892212195333045 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8362573099415205, "acc_stderr": 0.028380919596145866, "acc_norm": 0.8362573099415205, "acc_norm_stderr": 0.028380919596145866 }, "harness|truthfulqa:mc|0": { "mc1": 0.45165238678090575, "mc1_stderr": 0.017421480300277643, "mc2": 0.6211306316728467, "mc2_stderr": 0.01529356194952766 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? 
[More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
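The flattened results JSON above lists one accuracy block per task. To turn those per-task scores into a single MMLU-style number, the `hendrycksTest` entries can be filtered out and averaged. A minimal sketch, using a small hand-copied subset of the values shown above (illustrative only; this is not necessarily the leaderboard's exact aggregation code):

```python
# Illustrative sketch: average the "acc" of hendrycksTest (MMLU) tasks
# from a results dict shaped like the JSON above. The values here are a
# small hand-copied subset of the run above, purely for demonstration.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.29},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.5333333333333333},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.6842105263157895},
    # TruthfulQA reports mc1/mc2 instead of acc, so it is skipped below.
    "harness|truthfulqa:mc|0": {"mc1": 0.45165238678090575},
}

# Keep only the MMLU (hendrycksTest) tasks and average their accuracies.
mmlu = {
    task: scores["acc"]
    for task, scores in results.items()
    if task.startswith("harness|hendrycksTest-")
}
mmlu_avg = sum(mmlu.values()) / len(mmlu)
print(f"MMLU average over {len(mmlu)} tasks: {mmlu_avg:.4f}")
```

The same filtering works on the full 57-task dict in the results file; only the subset shown here is hypothetical.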
yzhuang/autotree_automl_Higgs_gosdt_l256_d3_sd0
2023-09-04T20:30:25.000Z
[ "region:us" ]
yzhuang
null
null
null
0
0
--- dataset_info: features: - name: id dtype: int64 - name: input_x sequence: sequence: float64 - name: input_y sequence: sequence: float32 - name: rtg sequence: float64 - name: status sequence: sequence: float32 - name: split_threshold sequence: sequence: float64 - name: split_dimension sequence: int64 splits: - name: train num_bytes: 6255200000 num_examples: 100000 - name: validation num_bytes: 625520000 num_examples: 10000 download_size: 4937414841 dataset_size: 6880720000 --- # Dataset Card for "autotree_automl_Higgs_gosdt_l256_d3_sd0" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
MrezaPRZ/confidence_llama
2023-09-04T20:34:32.000Z
[ "region:us" ]
MrezaPRZ
null
null
null
0
0
Entry not found
open-llm-leaderboard/details_jjaaaww__posi_13b
2023-09-17T21:53:21.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
null
0
0
--- pretty_name: Evaluation run of jjaaaww/posi_13b dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [jjaaaww/posi_13b](https://huggingface.co/jjaaaww/posi_13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 64 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jjaaaww__posi_13b\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-09-17T21:53:10.255526](https://huggingface.co/datasets/open-llm-leaderboard/details_jjaaaww__posi_13b/blob/main/results_2023-09-17T21-53-10.255526.json) (note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.07309144295302013,\n\ \ \"em_stderr\": 0.0026655778131843967,\n \"f1\": 0.13575608221476448,\n\ \ \"f1_stderr\": 0.0028538148334764264,\n \"acc\": 0.3891760458073461,\n\ \ \"acc_stderr\": 0.007704559089096075\n },\n \"harness|drop|3\": {\n\ \ \"em\": 0.07309144295302013,\n \"em_stderr\": 0.0026655778131843967,\n\ \ \"f1\": 0.13575608221476448,\n \"f1_stderr\": 0.0028538148334764264\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.01592115238817286,\n \ \ \"acc_stderr\": 0.003447819272388996\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.7624309392265194,\n \"acc_stderr\": 0.011961298905803153\n\ \ }\n}\n```" repo_url: https://huggingface.co/jjaaaww/posi_13b leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_09_04T20_34_43.210341 path: - '**/details_harness|arc:challenge|25_2023-09-04T20:34:43.210341.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-09-04T20:34:43.210341.parquet' - config_name: harness_drop_3 data_files: - split: 2023_09_17T21_53_10.255526 path: - '**/details_harness|drop|3_2023-09-17T21-53-10.255526.parquet' - split: latest path: - '**/details_harness|drop|3_2023-09-17T21-53-10.255526.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_09_17T21_53_10.255526 path: - '**/details_harness|gsm8k|5_2023-09-17T21-53-10.255526.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-09-17T21-53-10.255526.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_09_04T20_34_43.210341 path: - '**/details_harness|hellaswag|10_2023-09-04T20:34:43.210341.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-09-04T20:34:43.210341.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_09_04T20_34_43.210341 path: - 
'**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-04T20:34:43.210341.parquet' - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-management|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-04T20:34:43.210341.parquet' - 
'**/details_harness|hendrycksTest-moral_disputes|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-09-04T20:34:43.210341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-04T20:34:43.210341.parquet' - 
'**/details_harness|hendrycksTest-college_computer_science|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-04T20:34:43.210341.parquet' - 
'**/details_harness|hendrycksTest-high_school_physics|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-management|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-09-04T20:34:43.210341.parquet' - 
'**/details_harness|hendrycksTest-professional_medicine|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-09-04T20:34:43.210341.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-09-04T20:34:43.210341.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_09_04T20_34_43.210341 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-04T20:34:43.210341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-04T20:34:43.210341.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_09_04T20_34_43.210341 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-09-04T20:34:43.210341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-09-04T20:34:43.210341.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_09_04T20_34_43.210341 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-09-04T20:34:43.210341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-09-04T20:34:43.210341.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_09_04T20_34_43.210341 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-09-04T20:34:43.210341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-09-04T20:34:43.210341.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - 
split: 2023_09_04T20_34_43.210341 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-04T20:34:43.210341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-04T20:34:43.210341.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_09_04T20_34_43.210341 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-09-04T20:34:43.210341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-09-04T20:34:43.210341.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_09_04T20_34_43.210341 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-04T20:34:43.210341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-04T20:34:43.210341.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_09_04T20_34_43.210341 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-04T20:34:43.210341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-04T20:34:43.210341.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_09_04T20_34_43.210341 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-04T20:34:43.210341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-04T20:34:43.210341.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_09_04T20_34_43.210341 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-09-04T20:34:43.210341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-09-04T20:34:43.210341.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_09_04T20_34_43.210341 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-09-04T20:34:43.210341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-09-04T20:34:43.210341.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_09_04T20_34_43.210341 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-09-04T20:34:43.210341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-09-04T20:34:43.210341.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_09_04T20_34_43.210341 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-04T20:34:43.210341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-04T20:34:43.210341.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_09_04T20_34_43.210341 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-09-04T20:34:43.210341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-09-04T20:34:43.210341.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_09_04T20_34_43.210341 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-04T20:34:43.210341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-04T20:34:43.210341.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_09_04T20_34_43.210341 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-04T20:34:43.210341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-04T20:34:43.210341.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_09_04T20_34_43.210341 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-09-04T20:34:43.210341.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-09-04T20:34:43.210341.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_09_04T20_34_43.210341 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-09-04T20:34:43.210341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-09-04T20:34:43.210341.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_09_04T20_34_43.210341 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-04T20:34:43.210341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-04T20:34:43.210341.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_09_04T20_34_43.210341 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-04T20:34:43.210341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-04T20:34:43.210341.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_09_04T20_34_43.210341 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-04T20:34:43.210341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-04T20:34:43.210341.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_09_04T20_34_43.210341 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-04T20:34:43.210341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-04T20:34:43.210341.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_09_04T20_34_43.210341 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-04T20:34:43.210341.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-09-04T20:34:43.210341.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_09_04T20_34_43.210341 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-04T20:34:43.210341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-04T20:34:43.210341.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_09_04T20_34_43.210341 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-04T20:34:43.210341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-04T20:34:43.210341.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_09_04T20_34_43.210341 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-04T20:34:43.210341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-04T20:34:43.210341.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_09_04T20_34_43.210341 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-04T20:34:43.210341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-04T20:34:43.210341.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_09_04T20_34_43.210341 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-04T20:34:43.210341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-04T20:34:43.210341.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_09_04T20_34_43.210341 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-04T20:34:43.210341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-04T20:34:43.210341.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_09_04T20_34_43.210341 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-04T20:34:43.210341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-04T20:34:43.210341.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_09_04T20_34_43.210341 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-04T20:34:43.210341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-04T20:34:43.210341.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_09_04T20_34_43.210341 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-04T20:34:43.210341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-04T20:34:43.210341.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_09_04T20_34_43.210341 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-09-04T20:34:43.210341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-09-04T20:34:43.210341.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_09_04T20_34_43.210341 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-04T20:34:43.210341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-04T20:34:43.210341.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_09_04T20_34_43.210341 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-09-04T20:34:43.210341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-09-04T20:34:43.210341.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_09_04T20_34_43.210341 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-04T20:34:43.210341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-04T20:34:43.210341.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_09_04T20_34_43.210341 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-04T20:34:43.210341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-04T20:34:43.210341.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_09_04T20_34_43.210341 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-09-04T20:34:43.210341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-09-04T20:34:43.210341.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_09_04T20_34_43.210341 path: - '**/details_harness|hendrycksTest-management|5_2023-09-04T20:34:43.210341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-09-04T20:34:43.210341.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_09_04T20_34_43.210341 path: - '**/details_harness|hendrycksTest-marketing|5_2023-09-04T20:34:43.210341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-09-04T20:34:43.210341.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_09_04T20_34_43.210341 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-04T20:34:43.210341.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-09-04T20:34:43.210341.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_09_04T20_34_43.210341 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-04T20:34:43.210341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-04T20:34:43.210341.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_09_04T20_34_43.210341 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-04T20:34:43.210341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-04T20:34:43.210341.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_09_04T20_34_43.210341 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-04T20:34:43.210341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-04T20:34:43.210341.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_09_04T20_34_43.210341 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-09-04T20:34:43.210341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-09-04T20:34:43.210341.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_09_04T20_34_43.210341 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-09-04T20:34:43.210341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-09-04T20:34:43.210341.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_09_04T20_34_43.210341 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-09-04T20:34:43.210341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-09-04T20:34:43.210341.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_09_04T20_34_43.210341 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-04T20:34:43.210341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-04T20:34:43.210341.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_09_04T20_34_43.210341 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-09-04T20:34:43.210341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-09-04T20:34:43.210341.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_09_04T20_34_43.210341 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-04T20:34:43.210341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-04T20:34:43.210341.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_09_04T20_34_43.210341 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-04T20:34:43.210341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-04T20:34:43.210341.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_09_04T20_34_43.210341 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-09-04T20:34:43.210341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-09-04T20:34:43.210341.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_09_04T20_34_43.210341 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-09-04T20:34:43.210341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-09-04T20:34:43.210341.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_09_04T20_34_43.210341 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-09-04T20:34:43.210341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-09-04T20:34:43.210341.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_09_04T20_34_43.210341 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-04T20:34:43.210341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-04T20:34:43.210341.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_09_04T20_34_43.210341 path: - '**/details_harness|hendrycksTest-virology|5_2023-09-04T20:34:43.210341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-09-04T20:34:43.210341.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_09_04T20_34_43.210341 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-09-04T20:34:43.210341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-09-04T20:34:43.210341.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_09_04T20_34_43.210341 path: - '**/details_harness|truthfulqa:mc|0_2023-09-04T20:34:43.210341.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-09-04T20:34:43.210341.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_09_17T21_53_10.255526 path: - '**/details_harness|winogrande|5_2023-09-17T21-53-10.255526.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-09-17T21-53-10.255526.parquet' - config_name: results data_files: - split: 2023_09_04T20_34_43.210341 path: - results_2023-09-04T20:34:43.210341.parquet - split: 2023_09_17T21_53_10.255526 path: - results_2023-09-17T21-53-10.255526.parquet - split: latest path: - results_2023-09-17T21-53-10.255526.parquet --- # Dataset Card for Evaluation run of jjaaaww/posi_13b ## Dataset Description - **Homepage:** - 
**Repository:** https://huggingface.co/jjaaaww/posi_13b - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [jjaaaww/posi_13b](https://huggingface.co/jjaaaww/posi_13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_jjaaaww__posi_13b", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-09-17T21:53:10.255526](https://huggingface.co/datasets/open-llm-leaderboard/details_jjaaaww__posi_13b/blob/main/results_2023-09-17T21-53-10.255526.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. 
You can find each in the results and in the "latest" split for each eval): ```python { "all": { "em": 0.07309144295302013, "em_stderr": 0.0026655778131843967, "f1": 0.13575608221476448, "f1_stderr": 0.0028538148334764264, "acc": 0.3891760458073461, "acc_stderr": 0.007704559089096075 }, "harness|drop|3": { "em": 0.07309144295302013, "em_stderr": 0.0026655778131843967, "f1": 0.13575608221476448, "f1_stderr": 0.0028538148334764264 }, "harness|gsm8k|5": { "acc": 0.01592115238817286, "acc_stderr": 0.003447819272388996 }, "harness|winogrande|5": { "acc": 0.7624309392265194, "acc_stderr": 0.011961298905803153 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
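The aggregate "all" block in the results JSON above is not documented explicitly, but the numbers suggest it is the plain mean of the per-task metrics. A minimal sketch checking this, with the values copied verbatim from the "Latest results" JSON:

```python
# Per-task metrics copied verbatim from the "Latest results" JSON above.
per_task = {
    "harness|drop|3": {"em": 0.07309144295302013, "f1": 0.13575608221476448},
    "harness|gsm8k|5": {"acc": 0.01592115238817286},
    "harness|winogrande|5": {"acc": 0.7624309392265194},
}

# The mean over every per-task "acc" value reproduces the aggregate
# "acc" reported under "all" (0.3891760458073461).
accs = [m["acc"] for m in per_task.values() if "acc" in m]
mean_acc = sum(accs) / len(accs)
assert abs(mean_acc - 0.3891760458073461) < 1e-12
```

The "em" and "f1" aggregates trivially equal the drop values here, since drop is the only task that reports them.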
YassineBenlaria/tamasheq_data_1
2023-09-04T20:57:06.000Z
[ "region:us" ]
YassineBenlaria
null
null
null
0
0
--- configs: - config_name: default data_files: - split: test path: data/test-* - split: train path: data/train-* - split: validation path: data/validation-* dataset_info: features: - name: path dtype: string - name: sentence dtype: string - name: audio dtype: audio: sampling_rate: 16000 - name: sentence_lat dtype: string splits: - name: test num_bytes: 3785121.0 num_examples: 18 - name: train num_bytes: 70490040.97552449 num_examples: 267 - name: validation num_bytes: 6424920.161290322 num_examples: 19 download_size: 77375800 dataset_size: 80700082.1368148 --- # Dataset Card for "tamasheq_data_1" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
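The size fields in the YAML above fit together in a simple way: `dataset_size` is the sum of the per-split `num_bytes`. This is an observation from the numbers in this card, not documented behaviour, and can be checked directly:

```python
# Per-split byte counts copied from the card's YAML above.
split_bytes = {
    "test": 3785121.0,
    "train": 70490040.97552449,
    "validation": 6424920.161290322,
}

# dataset_size (80700082.1368148) is the sum of the split sizes;
# download_size (77375800) is smaller, presumably because the
# files are stored compressed on disk.
total = sum(split_bytes.values())
assert abs(total - 80700082.1368148) < 1e-3
```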
juselara1/mlds7_corpus
2023-09-04T21:01:39.000Z
[ "region:us" ]
juselara1
null
null
null
0
0
Entry not found
Junr-syl/movie_review
2023-09-04T22:11:05.000Z
[ "region:us" ]
Junr-syl
null
null
null
0
0
--- dataset_info: features: - name: input dtype: string - name: output dtype: string splits: - name: train num_bytes: 7334957 num_examples: 5000 download_size: 0 dataset_size: 7334957 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "movie_review" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
bhagmat/reg
2023-09-04T21:17:30.000Z
[ "region:us" ]
bhagmat
null
null
null
0
0
Entry not found
gauravvaid/python-code_samples
2023-09-06T10:50:18.000Z
[ "license:apache-2.0", "region:us" ]
gauravvaid
null
null
null
0
0
--- license: apache-2.0 ---
open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-Fintune_1_17w-gate_up_down_proj
2023-09-04T21:16:51.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
null
0
0
--- pretty_name: Evaluation run of CHIH-HUNG/llama-2-13b-Fintune_1_17w-gate_up_down_proj dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [CHIH-HUNG/llama-2-13b-Fintune_1_17w-gate_up_down_proj](https://huggingface.co/CHIH-HUNG/llama-2-13b-Fintune_1_17w-gate_up_down_proj)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 61 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split always points to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-Fintune_1_17w-gate_up_down_proj\"\ ,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\ \nThese are the [latest results from run 2023-09-04T21:15:30.560082](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-Fintune_1_17w-gate_up_down_proj/blob/main/results_2023-09-04T21%3A15%3A30.560082.json) (note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5475267614652215,\n\ \ \"acc_stderr\": 0.034609984402304776,\n \"acc_norm\": 0.5517087027206374,\n\ \ \"acc_norm_stderr\": 0.03459190488664288,\n \"mc1\": 0.25703794369645044,\n\ \ \"mc1_stderr\": 0.015298077509485076,\n \"mc2\": 0.369230031760143,\n\ \ \"mc2_stderr\": 0.013637900924748413\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.5213310580204779,\n \"acc_stderr\": 0.014598087973127106,\n\ \ \"acc_norm\": 0.5614334470989761,\n \"acc_norm_stderr\": 0.014500682618212865\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6037641904003187,\n\ \ \"acc_stderr\": 0.0048811488668741845,\n \"acc_norm\": 0.8103963353913562,\n\ \ \"acc_norm_stderr\": 0.003911862797736198\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \ \ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45925925925925926,\n\ \ \"acc_stderr\": 0.04304979692464242,\n \"acc_norm\": 0.45925925925925926,\n\ \ \"acc_norm_stderr\": 0.04304979692464242\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.5526315789473685,\n \"acc_stderr\": 0.0404633688397825,\n\ \ \"acc_norm\": 0.5526315789473685,\n \"acc_norm_stderr\": 0.0404633688397825\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.51,\n\ \ \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \ \ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.5886792452830188,\n \"acc_stderr\": 0.030285009259009794,\n\ \ \"acc_norm\": 0.5886792452830188,\n \"acc_norm_stderr\": 0.030285009259009794\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6041666666666666,\n\ \ \"acc_stderr\": 0.04089465449325583,\n \"acc_norm\": 0.6041666666666666,\n\ \ \"acc_norm_stderr\": 0.04089465449325583\n 
},\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \ \ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n\ \ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \ \ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5144508670520231,\n\ \ \"acc_stderr\": 0.03810871630454764,\n \"acc_norm\": 0.5144508670520231,\n\ \ \"acc_norm_stderr\": 0.03810871630454764\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.04280105837364395,\n\ \ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.04280105837364395\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n\ \ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.451063829787234,\n \"acc_stderr\": 0.032529096196131965,\n\ \ \"acc_norm\": 0.451063829787234,\n \"acc_norm_stderr\": 0.032529096196131965\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3157894736842105,\n\ \ \"acc_stderr\": 0.04372748290278005,\n \"acc_norm\": 0.3157894736842105,\n\ \ \"acc_norm_stderr\": 0.04372748290278005\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.04164188720169375,\n\ \ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.04164188720169375\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.3492063492063492,\n \"acc_stderr\": 0.024552292209342658,\n \"\ acc_norm\": 0.3492063492063492,\n 
\"acc_norm_stderr\": 0.024552292209342658\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.38095238095238093,\n\ \ \"acc_stderr\": 0.04343525428949097,\n \"acc_norm\": 0.38095238095238093,\n\ \ \"acc_norm_stderr\": 0.04343525428949097\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \ \ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\ : 0.6580645161290323,\n \"acc_stderr\": 0.026985289576552742,\n \"\ acc_norm\": 0.6580645161290323,\n \"acc_norm_stderr\": 0.026985289576552742\n\ \ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\ : 0.4236453201970443,\n \"acc_stderr\": 0.03476725747649037,\n \"\ acc_norm\": 0.4236453201970443,\n \"acc_norm_stderr\": 0.03476725747649037\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\"\ : 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.6787878787878788,\n \"acc_stderr\": 0.036462049632538115,\n\ \ \"acc_norm\": 0.6787878787878788,\n \"acc_norm_stderr\": 0.036462049632538115\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.7070707070707071,\n \"acc_stderr\": 0.03242497958178816,\n \"\ acc_norm\": 0.7070707070707071,\n \"acc_norm_stderr\": 0.03242497958178816\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.7772020725388601,\n \"acc_stderr\": 0.030031147977641538,\n\ \ \"acc_norm\": 0.7772020725388601,\n \"acc_norm_stderr\": 0.030031147977641538\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.4794871794871795,\n \"acc_stderr\": 0.02532966316348994,\n \ \ \"acc_norm\": 0.4794871794871795,\n \"acc_norm_stderr\": 0.02532966316348994\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.3037037037037037,\n \"acc_stderr\": 0.028037929969114986,\n \ \ \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.028037929969114986\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.5378151260504201,\n \"acc_stderr\": 0.032385469487589795,\n\ \ \"acc_norm\": 0.5378151260504201,\n \"acc_norm_stderr\": 0.032385469487589795\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\ acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.7486238532110092,\n \"acc_stderr\": 0.018599206360287415,\n \"\ acc_norm\": 0.7486238532110092,\n \"acc_norm_stderr\": 0.018599206360287415\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.4583333333333333,\n \"acc_stderr\": 0.03398110890294636,\n \"\ acc_norm\": 0.4583333333333333,\n \"acc_norm_stderr\": 0.03398110890294636\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.7303921568627451,\n \"acc_stderr\": 0.031145570659486782,\n \"\ acc_norm\": 0.7303921568627451,\n \"acc_norm_stderr\": 0.031145570659486782\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.7341772151898734,\n \"acc_stderr\": 0.02875679962965834,\n \ \ \"acc_norm\": 0.7341772151898734,\n \"acc_norm_stderr\": 0.02875679962965834\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6322869955156951,\n\ \ \"acc_stderr\": 0.03236198350928275,\n \"acc_norm\": 0.6322869955156951,\n\ \ \"acc_norm_stderr\": 0.03236198350928275\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.5954198473282443,\n \"acc_stderr\": 0.043046937953806645,\n\ \ \"acc_norm\": 0.5954198473282443,\n \"acc_norm_stderr\": 0.043046937953806645\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.743801652892562,\n \"acc_stderr\": 0.03984979653302872,\n \"acc_norm\"\ : 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302872\n },\n\ \ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6851851851851852,\n\ \ \"acc_stderr\": 0.04489931073591312,\n \"acc_norm\": 0.6851851851851852,\n\ \ \"acc_norm_stderr\": 0.04489931073591312\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.6319018404907976,\n \"acc_stderr\": 0.03789213935838396,\n\ \ \"acc_norm\": 0.6319018404907976,\n \"acc_norm_stderr\": 0.03789213935838396\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\ \ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\ \ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.7087378640776699,\n \"acc_stderr\": 0.04498676320572924,\n\ \ \"acc_norm\": 0.7087378640776699,\n \"acc_norm_stderr\": 0.04498676320572924\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8333333333333334,\n\ \ \"acc_stderr\": 0.02441494730454368,\n \"acc_norm\": 0.8333333333333334,\n\ \ \"acc_norm_stderr\": 0.02441494730454368\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \ \ \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7522349936143039,\n\ \ \"acc_stderr\": 0.015438083080568973,\n \"acc_norm\": 0.7522349936143039,\n\ \ \"acc_norm_stderr\": 0.015438083080568973\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.6560693641618497,\n \"acc_stderr\": 0.025574123786546665,\n\ \ \"acc_norm\": 0.6560693641618497,\n \"acc_norm_stderr\": 0.025574123786546665\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.31620111731843575,\n\ \ \"acc_stderr\": 0.015551673652172537,\n \"acc_norm\": 0.31620111731843575,\n\ \ 
\"acc_norm_stderr\": 0.015551673652172537\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.6241830065359477,\n \"acc_stderr\": 0.027732834353363947,\n\ \ \"acc_norm\": 0.6241830065359477,\n \"acc_norm_stderr\": 0.027732834353363947\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6109324758842444,\n\ \ \"acc_stderr\": 0.027690337536485372,\n \"acc_norm\": 0.6109324758842444,\n\ \ \"acc_norm_stderr\": 0.027690337536485372\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.6419753086419753,\n \"acc_stderr\": 0.0266756119260371,\n\ \ \"acc_norm\": 0.6419753086419753,\n \"acc_norm_stderr\": 0.0266756119260371\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.4078014184397163,\n \"acc_stderr\": 0.029316011776343555,\n \ \ \"acc_norm\": 0.4078014184397163,\n \"acc_norm_stderr\": 0.029316011776343555\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4132985658409387,\n\ \ \"acc_stderr\": 0.012576779494860083,\n \"acc_norm\": 0.4132985658409387,\n\ \ \"acc_norm_stderr\": 0.012576779494860083\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.5477941176470589,\n \"acc_stderr\": 0.03023375855159643,\n\ \ \"acc_norm\": 0.5477941176470589,\n \"acc_norm_stderr\": 0.03023375855159643\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.5473856209150327,\n \"acc_stderr\": 0.020136790918492523,\n \ \ \"acc_norm\": 0.5473856209150327,\n \"acc_norm_stderr\": 0.020136790918492523\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6,\n\ \ \"acc_stderr\": 0.0469237132203465,\n \"acc_norm\": 0.6,\n \ \ \"acc_norm_stderr\": 0.0469237132203465\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.5877551020408164,\n \"acc_stderr\": 0.031512360446742695,\n\ \ \"acc_norm\": 0.5877551020408164,\n \"acc_norm_stderr\": 0.031512360446742695\n\ \ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 
0.6965174129353234,\n\ \ \"acc_stderr\": 0.03251006816458619,\n \"acc_norm\": 0.6965174129353234,\n\ \ \"acc_norm_stderr\": 0.03251006816458619\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \ \ \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42168674698795183,\n\ \ \"acc_stderr\": 0.03844453181770917,\n \"acc_norm\": 0.42168674698795183,\n\ \ \"acc_norm_stderr\": 0.03844453181770917\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.7251461988304093,\n \"acc_stderr\": 0.03424042924691583,\n\ \ \"acc_norm\": 0.7251461988304093,\n \"acc_norm_stderr\": 0.03424042924691583\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.25703794369645044,\n\ \ \"mc1_stderr\": 0.015298077509485076,\n \"mc2\": 0.369230031760143,\n\ \ \"mc2_stderr\": 0.013637900924748413\n }\n}\n```" repo_url: https://huggingface.co/CHIH-HUNG/llama-2-13b-Fintune_1_17w-gate_up_down_proj leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_09_04T21_15_30.560082 path: - '**/details_harness|arc:challenge|25_2023-09-04T21:15:30.560082.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-09-04T21:15:30.560082.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_09_04T21_15_30.560082 path: - '**/details_harness|hellaswag|10_2023-09-04T21:15:30.560082.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-09-04T21:15:30.560082.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_09_04T21_15_30.560082 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-09-04T21:15:30.560082.parquet' - 
'**/details_harness|hendrycksTest-astronomy|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-04T21:15:30.560082.parquet' - 
'**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-management|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-04T21:15:30.560082.parquet' - 
'**/details_harness|hendrycksTest-nutrition|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-09-04T21:15:30.560082.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-04T21:15:30.560082.parquet' - 
'**/details_harness|hendrycksTest-college_mathematics|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-04T21:15:30.560082.parquet' - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-management|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-04T21:15:30.560082.parquet' - 
'**/details_harness|hendrycksTest-professional_psychology|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-09-04T21:15:30.560082.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-09-04T21:15:30.560082.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_09_04T21_15_30.560082 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-04T21:15:30.560082.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-04T21:15:30.560082.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_09_04T21_15_30.560082 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-09-04T21:15:30.560082.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-09-04T21:15:30.560082.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_09_04T21_15_30.560082 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-09-04T21:15:30.560082.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-09-04T21:15:30.560082.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_09_04T21_15_30.560082 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-09-04T21:15:30.560082.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-09-04T21:15:30.560082.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_09_04T21_15_30.560082 path: - 
'**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-04T21:15:30.560082.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-04T21:15:30.560082.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_09_04T21_15_30.560082 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-09-04T21:15:30.560082.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-09-04T21:15:30.560082.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_09_04T21_15_30.560082 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-04T21:15:30.560082.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-04T21:15:30.560082.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_09_04T21_15_30.560082 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-04T21:15:30.560082.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-04T21:15:30.560082.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_09_04T21_15_30.560082 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-04T21:15:30.560082.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-04T21:15:30.560082.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_09_04T21_15_30.560082 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-09-04T21:15:30.560082.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-09-04T21:15:30.560082.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_09_04T21_15_30.560082 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-09-04T21:15:30.560082.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-09-04T21:15:30.560082.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_09_04T21_15_30.560082 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-09-04T21:15:30.560082.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-09-04T21:15:30.560082.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_09_04T21_15_30.560082 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-04T21:15:30.560082.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-04T21:15:30.560082.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_09_04T21_15_30.560082 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-09-04T21:15:30.560082.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-09-04T21:15:30.560082.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_09_04T21_15_30.560082 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-04T21:15:30.560082.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-04T21:15:30.560082.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_09_04T21_15_30.560082 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-04T21:15:30.560082.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-04T21:15:30.560082.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_09_04T21_15_30.560082 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-09-04T21:15:30.560082.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-09-04T21:15:30.560082.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_09_04T21_15_30.560082 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-09-04T21:15:30.560082.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-09-04T21:15:30.560082.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_09_04T21_15_30.560082 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-04T21:15:30.560082.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-04T21:15:30.560082.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_09_04T21_15_30.560082 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-04T21:15:30.560082.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-04T21:15:30.560082.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_09_04T21_15_30.560082 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-04T21:15:30.560082.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-04T21:15:30.560082.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_09_04T21_15_30.560082 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-04T21:15:30.560082.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-04T21:15:30.560082.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_09_04T21_15_30.560082 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-04T21:15:30.560082.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-09-04T21:15:30.560082.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_09_04T21_15_30.560082 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-04T21:15:30.560082.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-04T21:15:30.560082.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_09_04T21_15_30.560082 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-04T21:15:30.560082.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-04T21:15:30.560082.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_09_04T21_15_30.560082 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-04T21:15:30.560082.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-04T21:15:30.560082.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_09_04T21_15_30.560082 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-04T21:15:30.560082.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-04T21:15:30.560082.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_09_04T21_15_30.560082 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-04T21:15:30.560082.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-04T21:15:30.560082.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_09_04T21_15_30.560082 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-04T21:15:30.560082.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-04T21:15:30.560082.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_09_04T21_15_30.560082 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-04T21:15:30.560082.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-04T21:15:30.560082.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_09_04T21_15_30.560082 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-04T21:15:30.560082.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-04T21:15:30.560082.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_09_04T21_15_30.560082 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-04T21:15:30.560082.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-04T21:15:30.560082.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_09_04T21_15_30.560082 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-09-04T21:15:30.560082.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-09-04T21:15:30.560082.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_09_04T21_15_30.560082 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-04T21:15:30.560082.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-04T21:15:30.560082.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_09_04T21_15_30.560082 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-09-04T21:15:30.560082.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-09-04T21:15:30.560082.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_09_04T21_15_30.560082 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-04T21:15:30.560082.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-04T21:15:30.560082.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_09_04T21_15_30.560082 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-04T21:15:30.560082.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-04T21:15:30.560082.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_09_04T21_15_30.560082 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-09-04T21:15:30.560082.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-09-04T21:15:30.560082.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_09_04T21_15_30.560082 path: - '**/details_harness|hendrycksTest-management|5_2023-09-04T21:15:30.560082.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-09-04T21:15:30.560082.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_09_04T21_15_30.560082 path: - '**/details_harness|hendrycksTest-marketing|5_2023-09-04T21:15:30.560082.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-09-04T21:15:30.560082.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_09_04T21_15_30.560082 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-04T21:15:30.560082.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-09-04T21:15:30.560082.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_09_04T21_15_30.560082 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-04T21:15:30.560082.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-04T21:15:30.560082.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_09_04T21_15_30.560082 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-04T21:15:30.560082.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-04T21:15:30.560082.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_09_04T21_15_30.560082 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-04T21:15:30.560082.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-04T21:15:30.560082.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_09_04T21_15_30.560082 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-09-04T21:15:30.560082.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-09-04T21:15:30.560082.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_09_04T21_15_30.560082 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-09-04T21:15:30.560082.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-09-04T21:15:30.560082.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_09_04T21_15_30.560082 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-09-04T21:15:30.560082.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-09-04T21:15:30.560082.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_09_04T21_15_30.560082 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-04T21:15:30.560082.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-04T21:15:30.560082.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_09_04T21_15_30.560082 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-09-04T21:15:30.560082.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-09-04T21:15:30.560082.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_09_04T21_15_30.560082 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-04T21:15:30.560082.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-04T21:15:30.560082.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_09_04T21_15_30.560082 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-04T21:15:30.560082.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-04T21:15:30.560082.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_09_04T21_15_30.560082 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-09-04T21:15:30.560082.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-09-04T21:15:30.560082.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_09_04T21_15_30.560082 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-09-04T21:15:30.560082.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-09-04T21:15:30.560082.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_09_04T21_15_30.560082 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-09-04T21:15:30.560082.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-09-04T21:15:30.560082.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_09_04T21_15_30.560082 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-04T21:15:30.560082.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-04T21:15:30.560082.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_09_04T21_15_30.560082 path: - '**/details_harness|hendrycksTest-virology|5_2023-09-04T21:15:30.560082.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-09-04T21:15:30.560082.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_09_04T21_15_30.560082 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-09-04T21:15:30.560082.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-09-04T21:15:30.560082.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_09_04T21_15_30.560082 path: - '**/details_harness|truthfulqa:mc|0_2023-09-04T21:15:30.560082.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-09-04T21:15:30.560082.parquet' - config_name: results data_files: - split: 2023_09_04T21_15_30.560082 path: - results_2023-09-04T21:15:30.560082.parquet - split: latest path: - results_2023-09-04T21:15:30.560082.parquet --- # Dataset Card for Evaluation run of CHIH-HUNG/llama-2-13b-Fintune_1_17w-gate_up_down_proj ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/CHIH-HUNG/llama-2-13b-Fintune_1_17w-gate_up_down_proj - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation 
run of model [CHIH-HUNG/llama-2-13b-Fintune_1_17w-gate_up_down_proj](https://huggingface.co/CHIH-HUNG/llama-2-13b-Fintune_1_17w-gate_up_down_proj) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-Fintune_1_17w-gate_up_down_proj", "harness_truthfulqa_mc_0", split="train") ``` ## Latest results These are the [latest results from run 2023-09-04T21:15:30.560082](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-Fintune_1_17w-gate_up_down_proj/blob/main/results_2023-09-04T21%3A15%3A30.560082.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. 
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.5475267614652215, "acc_stderr": 0.034609984402304776, "acc_norm": 0.5517087027206374, "acc_norm_stderr": 0.03459190488664288, "mc1": 0.25703794369645044, "mc1_stderr": 0.015298077509485076, "mc2": 0.369230031760143, "mc2_stderr": 0.013637900924748413 }, "harness|arc:challenge|25": { "acc": 0.5213310580204779, "acc_stderr": 0.014598087973127106, "acc_norm": 0.5614334470989761, "acc_norm_stderr": 0.014500682618212865 }, "harness|hellaswag|10": { "acc": 0.6037641904003187, "acc_stderr": 0.0048811488668741845, "acc_norm": 0.8103963353913562, "acc_norm_stderr": 0.003911862797736198 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.29, "acc_stderr": 0.045604802157206845, "acc_norm": 0.29, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.45925925925925926, "acc_stderr": 0.04304979692464242, "acc_norm": 0.45925925925925926, "acc_norm_stderr": 0.04304979692464242 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.5526315789473685, "acc_stderr": 0.0404633688397825, "acc_norm": 0.5526315789473685, "acc_norm_stderr": 0.0404633688397825 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.51, "acc_stderr": 0.05024183937956912, "acc_norm": 0.51, "acc_norm_stderr": 0.05024183937956912 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.5886792452830188, "acc_stderr": 0.030285009259009794, "acc_norm": 0.5886792452830188, "acc_norm_stderr": 0.030285009259009794 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.6041666666666666, "acc_stderr": 0.04089465449325583, "acc_norm": 0.6041666666666666, "acc_norm_stderr": 0.04089465449325583 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.46, "acc_stderr": 0.05009082659620333, "acc_norm": 0.46, "acc_norm_stderr": 0.05009082659620333 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.44, "acc_stderr": 0.04988876515698589, "acc_norm": 0.44, 
"acc_norm_stderr": 0.04988876515698589 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.33, "acc_stderr": 0.04725815626252605, "acc_norm": 0.33, "acc_norm_stderr": 0.04725815626252605 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.5144508670520231, "acc_stderr": 0.03810871630454764, "acc_norm": 0.5144508670520231, "acc_norm_stderr": 0.03810871630454764 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.24509803921568626, "acc_stderr": 0.04280105837364395, "acc_norm": 0.24509803921568626, "acc_norm_stderr": 0.04280105837364395 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.451063829787234, "acc_stderr": 0.032529096196131965, "acc_norm": 0.451063829787234, "acc_norm_stderr": 0.032529096196131965 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.3157894736842105, "acc_stderr": 0.04372748290278005, "acc_norm": 0.3157894736842105, "acc_norm_stderr": 0.04372748290278005 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5172413793103449, "acc_stderr": 0.04164188720169375, "acc_norm": 0.5172413793103449, "acc_norm_stderr": 0.04164188720169375 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.3492063492063492, "acc_stderr": 0.024552292209342658, "acc_norm": 0.3492063492063492, "acc_norm_stderr": 0.024552292209342658 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.38095238095238093, "acc_stderr": 0.04343525428949097, "acc_norm": 0.38095238095238093, "acc_norm_stderr": 0.04343525428949097 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.32, "acc_stderr": 0.046882617226215034, "acc_norm": 0.32, "acc_norm_stderr": 0.046882617226215034 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.6580645161290323, "acc_stderr": 0.026985289576552742, "acc_norm": 0.6580645161290323, "acc_norm_stderr": 0.026985289576552742 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4236453201970443, "acc_stderr": 0.03476725747649037, "acc_norm": 0.4236453201970443, "acc_norm_stderr": 0.03476725747649037 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.54, "acc_stderr": 0.05009082659620332, "acc_norm": 0.54, "acc_norm_stderr": 0.05009082659620332 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.6787878787878788, "acc_stderr": 0.036462049632538115, "acc_norm": 0.6787878787878788, "acc_norm_stderr": 0.036462049632538115 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7070707070707071, "acc_stderr": 0.03242497958178816, "acc_norm": 0.7070707070707071, "acc_norm_stderr": 0.03242497958178816 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.7772020725388601, "acc_stderr": 0.030031147977641538, "acc_norm": 0.7772020725388601, "acc_norm_stderr": 0.030031147977641538 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.4794871794871795, "acc_stderr": 0.02532966316348994, "acc_norm": 0.4794871794871795, "acc_norm_stderr": 0.02532966316348994 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3037037037037037, "acc_stderr": 0.028037929969114986, "acc_norm": 0.3037037037037037, "acc_norm_stderr": 0.028037929969114986 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.5378151260504201, "acc_stderr": 0.032385469487589795, "acc_norm": 0.5378151260504201, "acc_norm_stderr": 0.032385469487589795 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3509933774834437, "acc_stderr": 0.03896981964257375, "acc_norm": 0.3509933774834437, "acc_norm_stderr": 0.03896981964257375 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.7486238532110092, "acc_stderr": 0.018599206360287415, "acc_norm": 0.7486238532110092, "acc_norm_stderr": 0.018599206360287415 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.4583333333333333, "acc_stderr": 
0.03398110890294636, "acc_norm": 0.4583333333333333, "acc_norm_stderr": 0.03398110890294636 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7303921568627451, "acc_stderr": 0.031145570659486782, "acc_norm": 0.7303921568627451, "acc_norm_stderr": 0.031145570659486782 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7341772151898734, "acc_stderr": 0.02875679962965834, "acc_norm": 0.7341772151898734, "acc_norm_stderr": 0.02875679962965834 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6322869955156951, "acc_stderr": 0.03236198350928275, "acc_norm": 0.6322869955156951, "acc_norm_stderr": 0.03236198350928275 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.5954198473282443, "acc_stderr": 0.043046937953806645, "acc_norm": 0.5954198473282443, "acc_norm_stderr": 0.043046937953806645 }, "harness|hendrycksTest-international_law|5": { "acc": 0.743801652892562, "acc_stderr": 0.03984979653302872, "acc_norm": 0.743801652892562, "acc_norm_stderr": 0.03984979653302872 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.6851851851851852, "acc_stderr": 0.04489931073591312, "acc_norm": 0.6851851851851852, "acc_norm_stderr": 0.04489931073591312 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.6319018404907976, "acc_stderr": 0.03789213935838396, "acc_norm": 0.6319018404907976, "acc_norm_stderr": 0.03789213935838396 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.3125, "acc_stderr": 0.043994650575715215, "acc_norm": 0.3125, "acc_norm_stderr": 0.043994650575715215 }, "harness|hendrycksTest-management|5": { "acc": 0.7087378640776699, "acc_stderr": 0.04498676320572924, "acc_norm": 0.7087378640776699, "acc_norm_stderr": 0.04498676320572924 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8333333333333334, "acc_stderr": 0.02441494730454368, "acc_norm": 0.8333333333333334, "acc_norm_stderr": 0.02441494730454368 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.54, "acc_stderr": 0.05009082659620332, "acc_norm": 
0.54, "acc_norm_stderr": 0.05009082659620332 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7522349936143039, "acc_stderr": 0.015438083080568973, "acc_norm": 0.7522349936143039, "acc_norm_stderr": 0.015438083080568973 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6560693641618497, "acc_stderr": 0.025574123786546665, "acc_norm": 0.6560693641618497, "acc_norm_stderr": 0.025574123786546665 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.31620111731843575, "acc_stderr": 0.015551673652172537, "acc_norm": 0.31620111731843575, "acc_norm_stderr": 0.015551673652172537 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6241830065359477, "acc_stderr": 0.027732834353363947, "acc_norm": 0.6241830065359477, "acc_norm_stderr": 0.027732834353363947 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6109324758842444, "acc_stderr": 0.027690337536485372, "acc_norm": 0.6109324758842444, "acc_norm_stderr": 0.027690337536485372 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.6419753086419753, "acc_stderr": 0.0266756119260371, "acc_norm": 0.6419753086419753, "acc_norm_stderr": 0.0266756119260371 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.4078014184397163, "acc_stderr": 0.029316011776343555, "acc_norm": 0.4078014184397163, "acc_norm_stderr": 0.029316011776343555 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4132985658409387, "acc_stderr": 0.012576779494860083, "acc_norm": 0.4132985658409387, "acc_norm_stderr": 0.012576779494860083 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.5477941176470589, "acc_stderr": 0.03023375855159643, "acc_norm": 0.5477941176470589, "acc_norm_stderr": 0.03023375855159643 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.5473856209150327, "acc_stderr": 0.020136790918492523, "acc_norm": 0.5473856209150327, "acc_norm_stderr": 0.020136790918492523 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6, "acc_stderr": 0.0469237132203465, "acc_norm": 0.6, 
"acc_norm_stderr": 0.0469237132203465 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.5877551020408164, "acc_stderr": 0.031512360446742695, "acc_norm": 0.5877551020408164, "acc_norm_stderr": 0.031512360446742695 }, "harness|hendrycksTest-sociology|5": { "acc": 0.6965174129353234, "acc_stderr": 0.03251006816458619, "acc_norm": 0.6965174129353234, "acc_norm_stderr": 0.03251006816458619 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.79, "acc_stderr": 0.040936018074033256, "acc_norm": 0.79, "acc_norm_stderr": 0.040936018074033256 }, "harness|hendrycksTest-virology|5": { "acc": 0.42168674698795183, "acc_stderr": 0.03844453181770917, "acc_norm": 0.42168674698795183, "acc_norm_stderr": 0.03844453181770917 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7251461988304093, "acc_stderr": 0.03424042924691583, "acc_norm": 0.7251461988304093, "acc_norm_stderr": 0.03424042924691583 }, "harness|truthfulqa:mc|0": { "mc1": 0.25703794369645044, "mc1_stderr": 0.015298077509485076, "mc2": 0.369230031760143, "mc2_stderr": 0.013637900924748413 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? 
[More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
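The split-naming convention used in the configs above (each run's split is named after the run timestamp, with `-` and `:` replaced by `_`, while `latest` always points at the newest run) can be sketched as follows; `timestamp_to_split` is an illustrative helper, not part of the `datasets` API:

```python
def timestamp_to_split(run_timestamp: str) -> str:
    """Map a run timestamp to the split name used in this dataset's configs."""
    # Split names cannot contain "-" or ":", so both are replaced by "_".
    return run_timestamp.replace("-", "_").replace(":", "_")

# The run shown in "Latest results" maps to the split name seen in the YAML configs:
print(timestamp_to_split("2023-09-04T21:15:30.560082"))
# → 2023_09_04T21_15_30.560082
```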
kpola009/embedding_dataset
2023-09-04T21:22:39.000Z
[ "license:apache-2.0", "region:us" ]
kpola009
null
null
null
0
0
--- license: apache-2.0 ---
HydraLM/megacode2-min100-standardized
2023-09-04T21:37:42.000Z
[ "region:us" ]
HydraLM
null
null
null
0
0
--- dataset_info: features: - name: message dtype: string - name: message_type dtype: string - name: message_id dtype: int64 - name: conversation_id dtype: int64 splits: - name: train num_bytes: 981690152 num_examples: 1025272 - name: test num_bytes: 1059315 num_examples: 1114 download_size: 527732674 dataset_size: 982749467 --- # Dataset Card for "megacode2-min100-standardized" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
reinforz/pi_hackaprompt_squad
2023-09-05T08:51:43.000Z
[ "region:us" ]
reinforz
null
null
null
0
0
--- dataset_info: features: - name: text dtype: string - name: malicious dtype: bool splits: - name: train num_bytes: 78112891 num_examples: 396439 download_size: 24412673 dataset_size: 78112891 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "pi_hackaprompt_squad" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
LeoLM/German_Songs
2023-09-04T21:49:42.000Z
[ "region:us" ]
LeoLM
null
null
null
0
0
--- dataset_info: features: - name: prompt dtype: string - name: analysis_prompt dtype: string - name: topic dtype: string - name: song dtype: string - name: analysis dtype: string splits: - name: train num_bytes: 1972513 num_examples: 500 download_size: 804509 dataset_size: 1972513 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "german_songs_gpt4" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
jamesWalker55/anime-scenes-vid
2023-09-06T00:57:49.000Z
[ "region:us" ]
jamesWalker55
null
null
null
0
0
Entry not found
jscode13/dog_data
2023-09-04T22:01:45.000Z
[ "region:us" ]
jscode13
null
null
null
0
0
Entry not found
Junr-syl/movie_review_1
2023-09-05T18:43:26.000Z
[ "region:us" ]
Junr-syl
null
null
null
0
0
--- dataset_info: features: - name: input dtype: string - name: output dtype: string splits: - name: train num_bytes: 7334957 num_examples: 5000 download_size: 0 dataset_size: 7334957 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "movie_review_1" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
nampdn-ai/mini-ultrachat
2023-09-05T00:15:02.000Z
[ "region:us" ]
nampdn-ai
null
null
null
2
0
Entry not found
matgu23/pg
2023-09-04T22:31:48.000Z
[ "region:us" ]
matgu23
null
null
null
0
0
Entry not found
open-llm-leaderboard/details_synapsoft__Llama-2-7b-chat-hf-flan2022-1.2M
2023-09-23T08:39:13.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
null
0
0
--- pretty_name: Evaluation run of synapsoft/Llama-2-7b-chat-hf-flan2022-1.2M dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [synapsoft/Llama-2-7b-chat-hf-flan2022-1.2M](https://huggingface.co/synapsoft/Llama-2-7b-chat-hf-flan2022-1.2M)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 64 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_synapsoft__Llama-2-7b-chat-hf-flan2022-1.2M\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-09-23T08:39:00.771555](https://huggingface.co/datasets/open-llm-leaderboard/details_synapsoft__Llama-2-7b-chat-hf-flan2022-1.2M/blob/main/results_2023-09-23T08-39-00.771555.json) (note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.2627936241610738,\n\ \ \"em_stderr\": 0.004507560917898865,\n \"f1\": 0.30115981543624176,\n\ \ \"f1_stderr\": 0.004494140287139199,\n \"acc\": 0.3666975232366727,\n\ \ \"acc_stderr\": 0.008004674480789642\n },\n \"harness|drop|3\": {\n\ \ \"em\": 0.2627936241610738,\n \"em_stderr\": 0.004507560917898865,\n\ \ \"f1\": 0.30115981543624176,\n \"f1_stderr\": 0.004494140287139199\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.015163002274450341,\n \ \ \"acc_stderr\": 0.003366022949726345\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.7182320441988951,\n \"acc_stderr\": 0.01264332601185294\n\ \ }\n}\n```" repo_url: https://huggingface.co/synapsoft/Llama-2-7b-chat-hf-flan2022-1.2M leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_09_04T22_45_47.858606 path: - '**/details_harness|arc:challenge|25_2023-09-04T22:45:47.858606.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-09-04T22:45:47.858606.parquet' - config_name: harness_drop_3 data_files: - split: 2023_09_23T08_39_00.771555 path: - '**/details_harness|drop|3_2023-09-23T08-39-00.771555.parquet' - split: latest path: - '**/details_harness|drop|3_2023-09-23T08-39-00.771555.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_09_23T08_39_00.771555 path: - '**/details_harness|gsm8k|5_2023-09-23T08-39-00.771555.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-09-23T08-39-00.771555.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_09_04T22_45_47.858606 path: - '**/details_harness|hellaswag|10_2023-09-04T22:45:47.858606.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-09-04T22:45:47.858606.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 
2023_09_04T22_45_47.858606 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-04T22:45:47.858606.parquet' - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-management|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-04T22:45:47.858606.parquet' - 
'**/details_harness|hendrycksTest-moral_disputes|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-09-04T22:45:47.858606.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-04T22:45:47.858606.parquet' - 
'**/details_harness|hendrycksTest-college_computer_science|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-04T22:45:47.858606.parquet' - 
'**/details_harness|hendrycksTest-high_school_physics|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-management|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-09-04T22:45:47.858606.parquet' - 
'**/details_harness|hendrycksTest-professional_medicine|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-09-04T22:45:47.858606.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-09-04T22:45:47.858606.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_09_04T22_45_47.858606 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-04T22:45:47.858606.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-04T22:45:47.858606.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_09_04T22_45_47.858606 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-09-04T22:45:47.858606.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-09-04T22:45:47.858606.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_09_04T22_45_47.858606 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-09-04T22:45:47.858606.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-09-04T22:45:47.858606.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_09_04T22_45_47.858606 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-09-04T22:45:47.858606.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-09-04T22:45:47.858606.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - 
split: 2023_09_04T22_45_47.858606 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-04T22:45:47.858606.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-04T22:45:47.858606.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_09_04T22_45_47.858606 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-09-04T22:45:47.858606.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-09-04T22:45:47.858606.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_09_04T22_45_47.858606 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-04T22:45:47.858606.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-04T22:45:47.858606.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_09_04T22_45_47.858606 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-04T22:45:47.858606.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-04T22:45:47.858606.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_09_04T22_45_47.858606 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-04T22:45:47.858606.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-04T22:45:47.858606.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_09_04T22_45_47.858606 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-09-04T22:45:47.858606.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-09-04T22:45:47.858606.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_09_04T22_45_47.858606 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-09-04T22:45:47.858606.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-09-04T22:45:47.858606.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_09_04T22_45_47.858606 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-09-04T22:45:47.858606.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-09-04T22:45:47.858606.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_09_04T22_45_47.858606 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-04T22:45:47.858606.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-04T22:45:47.858606.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_09_04T22_45_47.858606 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-09-04T22:45:47.858606.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-09-04T22:45:47.858606.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_09_04T22_45_47.858606 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-04T22:45:47.858606.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-04T22:45:47.858606.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_09_04T22_45_47.858606 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-04T22:45:47.858606.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-04T22:45:47.858606.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_09_04T22_45_47.858606 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-09-04T22:45:47.858606.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-09-04T22:45:47.858606.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_09_04T22_45_47.858606 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-09-04T22:45:47.858606.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-09-04T22:45:47.858606.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_09_04T22_45_47.858606 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-04T22:45:47.858606.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-04T22:45:47.858606.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_09_04T22_45_47.858606 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-04T22:45:47.858606.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-04T22:45:47.858606.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_09_04T22_45_47.858606 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-04T22:45:47.858606.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-04T22:45:47.858606.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_09_04T22_45_47.858606 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-04T22:45:47.858606.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-04T22:45:47.858606.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_09_04T22_45_47.858606 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-04T22:45:47.858606.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-09-04T22:45:47.858606.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_09_04T22_45_47.858606 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-04T22:45:47.858606.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-04T22:45:47.858606.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_09_04T22_45_47.858606 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-04T22:45:47.858606.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-04T22:45:47.858606.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_09_04T22_45_47.858606 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-04T22:45:47.858606.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-04T22:45:47.858606.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_09_04T22_45_47.858606 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-04T22:45:47.858606.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-04T22:45:47.858606.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_09_04T22_45_47.858606 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-04T22:45:47.858606.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-04T22:45:47.858606.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_09_04T22_45_47.858606 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-04T22:45:47.858606.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-04T22:45:47.858606.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_09_04T22_45_47.858606 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-04T22:45:47.858606.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-04T22:45:47.858606.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_09_04T22_45_47.858606 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-04T22:45:47.858606.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-04T22:45:47.858606.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_09_04T22_45_47.858606 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-04T22:45:47.858606.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-04T22:45:47.858606.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_09_04T22_45_47.858606 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-09-04T22:45:47.858606.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-09-04T22:45:47.858606.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_09_04T22_45_47.858606 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-04T22:45:47.858606.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-04T22:45:47.858606.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_09_04T22_45_47.858606 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-09-04T22:45:47.858606.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-09-04T22:45:47.858606.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_09_04T22_45_47.858606 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-04T22:45:47.858606.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-04T22:45:47.858606.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_09_04T22_45_47.858606 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-04T22:45:47.858606.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-04T22:45:47.858606.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_09_04T22_45_47.858606 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-09-04T22:45:47.858606.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-09-04T22:45:47.858606.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_09_04T22_45_47.858606 path: - '**/details_harness|hendrycksTest-management|5_2023-09-04T22:45:47.858606.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-09-04T22:45:47.858606.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_09_04T22_45_47.858606 path: - '**/details_harness|hendrycksTest-marketing|5_2023-09-04T22:45:47.858606.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-09-04T22:45:47.858606.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_09_04T22_45_47.858606 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-04T22:45:47.858606.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-09-04T22:45:47.858606.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_09_04T22_45_47.858606 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-04T22:45:47.858606.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-04T22:45:47.858606.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_09_04T22_45_47.858606 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-04T22:45:47.858606.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-04T22:45:47.858606.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_09_04T22_45_47.858606 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-04T22:45:47.858606.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-04T22:45:47.858606.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_09_04T22_45_47.858606 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-09-04T22:45:47.858606.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-09-04T22:45:47.858606.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_09_04T22_45_47.858606 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-09-04T22:45:47.858606.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-09-04T22:45:47.858606.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_09_04T22_45_47.858606 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-09-04T22:45:47.858606.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-09-04T22:45:47.858606.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_09_04T22_45_47.858606 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-04T22:45:47.858606.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-04T22:45:47.858606.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_09_04T22_45_47.858606 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-09-04T22:45:47.858606.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-09-04T22:45:47.858606.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_09_04T22_45_47.858606 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-04T22:45:47.858606.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-04T22:45:47.858606.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_09_04T22_45_47.858606 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-04T22:45:47.858606.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-04T22:45:47.858606.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_09_04T22_45_47.858606 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-09-04T22:45:47.858606.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-09-04T22:45:47.858606.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_09_04T22_45_47.858606 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-09-04T22:45:47.858606.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-09-04T22:45:47.858606.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_09_04T22_45_47.858606 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-09-04T22:45:47.858606.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-09-04T22:45:47.858606.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_09_04T22_45_47.858606 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-04T22:45:47.858606.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-04T22:45:47.858606.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_09_04T22_45_47.858606 path: - '**/details_harness|hendrycksTest-virology|5_2023-09-04T22:45:47.858606.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-09-04T22:45:47.858606.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_09_04T22_45_47.858606 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-09-04T22:45:47.858606.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-09-04T22:45:47.858606.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_09_04T22_45_47.858606 path: - '**/details_harness|truthfulqa:mc|0_2023-09-04T22:45:47.858606.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-09-04T22:45:47.858606.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_09_23T08_39_00.771555 path: - '**/details_harness|winogrande|5_2023-09-23T08-39-00.771555.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-09-23T08-39-00.771555.parquet' - config_name: results data_files: - split: 2023_09_04T22_45_47.858606 path: - results_2023-09-04T22:45:47.858606.parquet - split: 2023_09_23T08_39_00.771555 path: - results_2023-09-23T08-39-00.771555.parquet - split: latest path: - results_2023-09-23T08-39-00.771555.parquet --- # Dataset Card for Evaluation run of synapsoft/Llama-2-7b-chat-hf-flan2022-1.2M ## Dataset 
Description - **Homepage:** - **Repository:** https://huggingface.co/synapsoft/Llama-2-7b-chat-hf-flan2022-1.2M - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [synapsoft/Llama-2-7b-chat-hf-flan2022-1.2M](https://huggingface.co/synapsoft/Llama-2-7b-chat-hf-flan2022-1.2M) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_synapsoft__Llama-2-7b-chat-hf-flan2022-1.2M", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-09-23T08:39:00.771555](https://huggingface.co/datasets/open-llm-leaderboard/details_synapsoft__Llama-2-7b-chat-hf-flan2022-1.2M/blob/main/results_2023-09-23T08-39-00.771555.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. 
You can find each in the results and the "latest" split for each eval): ```json { "all": { "em": 0.2627936241610738, "em_stderr": 0.004507560917898865, "f1": 0.30115981543624176, "f1_stderr": 0.004494140287139199, "acc": 0.3666975232366727, "acc_stderr": 0.008004674480789642 }, "harness|drop|3": { "em": 0.2627936241610738, "em_stderr": 0.004507560917898865, "f1": 0.30115981543624176, "f1_stderr": 0.004494140287139199 }, "harness|gsm8k|5": { "acc": 0.015163002274450341, "acc_stderr": 0.003366022949726345 }, "harness|winogrande|5": { "acc": 0.7182320441988951, "acc_stderr": 0.01264332601185294 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
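As a quick orientation to the numbers above, the per-task metrics from the "Latest results" JSON can be flattened into a small summary table. The values below are copied from that JSON; the `summarize` helper is purely illustrative and not part of the evaluation harness:

```python
# Per-task metrics copied from the "Latest results" JSON above
# (the aggregated "all" entry is omitted here).
latest = {
    "harness|drop|3": {"em": 0.2627936241610738, "f1": 0.30115981543624176},
    "harness|gsm8k|5": {"acc": 0.015163002274450341},
    "harness|winogrande|5": {"acc": 0.7182320441988951},
}

def summarize(results):
    """Flatten {task: {metric: value}} into sorted (task, metric, value) rows."""
    rows = []
    for task, metrics in sorted(results.items()):
        for metric, value in sorted(metrics.items()):
            rows.append((task, metric, round(value, 4)))
    return rows

for task, metric, value in summarize(latest):
    print(f"{task:22s} {metric:4s} {value:.4f}")
```

The same idea applies to the full per-run JSON files in the repo: each task key holds a flat dict of metric names to floats, so generic traversal code like this works for any run.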
csfsd/csmov
2023-09-04T23:55:08.000Z
[ "region:us" ]
csfsd
null
null
null
0
0
Entry not found
Ayaka4543/gahsbkjans34
2023-09-04T23:58:07.000Z
[ "region:us" ]
Ayaka4543
null
null
null
0
0
Entry not found
hanny21/newscs
2023-09-05T00:21:05.000Z
[ "region:us" ]
hanny21
null
null
null
0
0
Entry not found
KhalfounMehdi/mura_dataset_processed_224px_train_val_with_labels
2023-09-05T00:57:11.000Z
[ "region:us" ]
KhalfounMehdi
null
null
null
0
0
--- configs: - config_name: default data_files: - split: test path: data/test-* - split: train path: data/train-* - split: validation path: data/validation-* dataset_info: features: - name: image dtype: image - name: label dtype: class_label: names: '0': abnormal '1': normal - name: labels dtype: int64 splits: - name: test num_bytes: 99750354.875 num_examples: 4001 - name: train num_bytes: 897948950.5 num_examples: 36004 - name: validation num_bytes: 99750354.875 num_examples: 4001 download_size: 1097501239 dataset_size: 1097449660.25 --- # Dataset Card for "mura_dataset_processed_224px_train_val_with_labels" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
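For a quick sanity check on the `dataset_info` declared above, the split sizes and the `class_label` names can be tabulated locally. The numbers and label names are copied from the card metadata; no `datasets` dependency is assumed:

```python
# Label names and split sizes as declared in the card's dataset_info above.
label_names = ["abnormal", "normal"]  # class_label ids 0 and 1
splits = {"train": 36004, "validation": 4001, "test": 4001}

total = sum(splits.values())
print(f"total examples: {total}")
for name, count in splits.items():
    print(f"{name:10s} {count:6d} ({count / total:.1%})")
```

Note that the declared `test` and `validation` splits have identical sizes and byte counts, which suggests they may point at the same underlying examples.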
honglinggoh/products_desc_and_marktng_emails_dataset
2023-09-05T01:15:09.000Z
[ "region:us" ]
honglinggoh
null
null
null
0
0
--- dataset_info: features: - name: product dtype: string - name: description dtype: string - name: marketing_email dtype: string splits: - name: train num_bytes: 25130 num_examples: 13 download_size: 27570 dataset_size: 25130 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "products_desc_and_marktng_emails_dataset" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
jschew39/genai_marketmail_sample
2023-09-05T01:20:19.000Z
[ "region:us" ]
jschew39
null
null
null
0
0
--- dataset_info: features: - name: product dtype: string - name: description dtype: string - name: marketing_email dtype: string splits: - name: train num_bytes: 15396 num_examples: 8 download_size: 23299 dataset_size: 15396 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "genai_marketmail_sample" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
sparkyfina/dino_marketing_emails
2023-09-05T01:24:48.000Z
[ "region:us" ]
sparkyfina
null
null
null
0
0
--- dataset_info: features: - name: product dtype: string - name: description dtype: string - name: marketing_email dtype: string splits: - name: train num_bytes: 37399 num_examples: 20 download_size: 33872 dataset_size: 37399 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "dino_marketing_emails" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
unmeshk/fb-workshop-09042023-marketing-email-ds
2023-09-05T01:25:18.000Z
[ "region:us" ]
unmeshk
null
null
null
0
0
--- dataset_info: features: - name: product dtype: string - name: description dtype: string - name: marketing_email dtype: string splits: - name: train num_bytes: 18423 num_examples: 10 download_size: 24475 dataset_size: 18423 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "fb-workshop-09042023-marketing-email-ds" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
amalina-k/marketing-mail
2023-09-05T01:40:59.000Z
[ "region:us" ]
amalina-k
null
null
null
0
0
--- dataset_info: features: - name: product dtype: string - name: description dtype: string - name: marketing_email dtype: string splits: - name: train num_bytes: 7993 num_examples: 5 download_size: 16846 dataset_size: 7993 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "marketing-mail" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
open-llm-leaderboard/details_Kiddyz__testlm-3
2023-09-05T01:44:05.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
null
0
0
--- pretty_name: Evaluation run of Kiddyz/testlm-3 dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [Kiddyz/testlm-3](https://huggingface.co/Kiddyz/testlm-3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 61 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Kiddyz__testlm-3\"\ ,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\ \nThese are the [latest results from run 2023-09-05T01:42:44.018659](https://huggingface.co/datasets/open-llm-leaderboard/details_Kiddyz__testlm-3/blob/main/results_2023-09-05T01%3A42%3A44.018659.json)(note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5187353874495172,\n\ \ \"acc_stderr\": 0.035029744575697866,\n \"acc_norm\": 0.5224927626027567,\n\ \ \"acc_norm_stderr\": 0.03501563654467944,\n \"mc1\": 0.3157894736842105,\n\ \ \"mc1_stderr\": 0.016272287957916916,\n \"mc2\": 0.46416337242431305,\n\ \ \"mc2_stderr\": 0.015037341919156266\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.5025597269624573,\n \"acc_stderr\": 0.014611199329843784,\n\ \ \"acc_norm\": 0.5358361774744027,\n \"acc_norm_stderr\": 0.014573813664735718\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5963951404102769,\n\ \ \"acc_stderr\": 0.004896173035943312,\n \"acc_norm\": 0.7848038239394542,\n\ \ \"acc_norm_stderr\": 0.00410118487096419\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \ \ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45925925925925926,\n\ \ \"acc_stderr\": 0.04304979692464243,\n \"acc_norm\": 0.45925925925925926,\n\ \ \"acc_norm_stderr\": 0.04304979692464243\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.04063302731486671,\n\ \ \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.04063302731486671\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n\ \ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \ \ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.5584905660377358,\n \"acc_stderr\": 0.030561590426731837,\n\ \ \"acc_norm\": 0.5584905660377358,\n \"acc_norm_stderr\": 0.030561590426731837\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5208333333333334,\n\ \ \"acc_stderr\": 0.041775789507399935,\n \"acc_norm\": 0.5208333333333334,\n\ \ \"acc_norm_stderr\": 
0.041775789507399935\n },\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \ \ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n\ \ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \ \ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\ \ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.45664739884393063,\n\ \ \"acc_stderr\": 0.03798106566014498,\n \"acc_norm\": 0.45664739884393063,\n\ \ \"acc_norm_stderr\": 0.03798106566014498\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.044405219061793254,\n\ \ \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.044405219061793254\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n\ \ \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.40425531914893614,\n \"acc_stderr\": 0.03208115750788684,\n\ \ \"acc_norm\": 0.40425531914893614,\n \"acc_norm_stderr\": 0.03208115750788684\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n\ \ \"acc_stderr\": 0.04185774424022056,\n \"acc_norm\": 0.2719298245614035,\n\ \ \"acc_norm_stderr\": 0.04185774424022056\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.47586206896551725,\n \"acc_stderr\": 0.04161808503501528,\n\ \ \"acc_norm\": 0.47586206896551725,\n \"acc_norm_stderr\": 0.04161808503501528\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.31746031746031744,\n \"acc_stderr\": 0.02397386199899208,\n \"\ acc_norm\": 
0.31746031746031744,\n \"acc_norm_stderr\": 0.02397386199899208\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n\ \ \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n\ \ \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \ \ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\ : 0.5967741935483871,\n \"acc_stderr\": 0.027906150826041146,\n \"\ acc_norm\": 0.5967741935483871,\n \"acc_norm_stderr\": 0.027906150826041146\n\ \ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\ : 0.35467980295566504,\n \"acc_stderr\": 0.0336612448905145,\n \"\ acc_norm\": 0.35467980295566504,\n \"acc_norm_stderr\": 0.0336612448905145\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\"\ : 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.6848484848484848,\n \"acc_stderr\": 0.0362773057502241,\n\ \ \"acc_norm\": 0.6848484848484848,\n \"acc_norm_stderr\": 0.0362773057502241\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.6414141414141414,\n \"acc_stderr\": 0.03416903640391521,\n \"\ acc_norm\": 0.6414141414141414,\n \"acc_norm_stderr\": 0.03416903640391521\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.7150259067357513,\n \"acc_stderr\": 0.03257714077709662,\n\ \ \"acc_norm\": 0.7150259067357513,\n \"acc_norm_stderr\": 0.03257714077709662\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.49743589743589745,\n \"acc_stderr\": 0.025350672979412202,\n\ \ \"acc_norm\": 0.49743589743589745,\n \"acc_norm_stderr\": 0.025350672979412202\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.28888888888888886,\n \"acc_stderr\": 0.027634907264178544,\n \ \ \"acc_norm\": 0.28888888888888886,\n \"acc_norm_stderr\": 0.027634907264178544\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.46638655462184875,\n \"acc_stderr\": 0.03240501447690071,\n\ \ \"acc_norm\": 0.46638655462184875,\n \"acc_norm_stderr\": 0.03240501447690071\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.3841059602649007,\n \"acc_stderr\": 0.03971301814719197,\n \"\ acc_norm\": 0.3841059602649007,\n \"acc_norm_stderr\": 0.03971301814719197\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.7009174311926606,\n \"acc_stderr\": 0.019630417285415182,\n \"\ acc_norm\": 0.7009174311926606,\n \"acc_norm_stderr\": 0.019630417285415182\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\ : 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\ \ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7058823529411765,\n\ \ \"acc_stderr\": 0.03198001660115072,\n \"acc_norm\": 0.7058823529411765,\n\ \ \"acc_norm_stderr\": 0.03198001660115072\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\ : {\n \"acc\": 0.7383966244725738,\n \"acc_stderr\": 0.028609516716994934,\n\ \ \"acc_norm\": 0.7383966244725738,\n \"acc_norm_stderr\": 0.028609516716994934\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.600896860986547,\n\ \ \"acc_stderr\": 0.03286745312567961,\n \"acc_norm\": 0.600896860986547,\n\ \ \"acc_norm_stderr\": 0.03286745312567961\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.6106870229007634,\n \"acc_stderr\": 0.04276486542814591,\n\ \ \"acc_norm\": 0.6106870229007634,\n \"acc_norm_stderr\": 0.04276486542814591\n\ \ },\n \"harness|hendrycksTest-international_law|5\": 
{\n \"acc\":\ \ 0.628099173553719,\n \"acc_stderr\": 0.04412015806624505,\n \"acc_norm\"\ : 0.628099173553719,\n \"acc_norm_stderr\": 0.04412015806624505\n },\n\ \ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6296296296296297,\n\ \ \"acc_stderr\": 0.04668408033024931,\n \"acc_norm\": 0.6296296296296297,\n\ \ \"acc_norm_stderr\": 0.04668408033024931\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.5828220858895705,\n \"acc_stderr\": 0.038741028598180814,\n\ \ \"acc_norm\": 0.5828220858895705,\n \"acc_norm_stderr\": 0.038741028598180814\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3392857142857143,\n\ \ \"acc_stderr\": 0.04493949068613539,\n \"acc_norm\": 0.3392857142857143,\n\ \ \"acc_norm_stderr\": 0.04493949068613539\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.6699029126213593,\n \"acc_stderr\": 0.0465614711001235,\n\ \ \"acc_norm\": 0.6699029126213593,\n \"acc_norm_stderr\": 0.0465614711001235\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7521367521367521,\n\ \ \"acc_stderr\": 0.0282863240755644,\n \"acc_norm\": 0.7521367521367521,\n\ \ \"acc_norm_stderr\": 0.0282863240755644\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \ \ \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.698595146871009,\n\ \ \"acc_stderr\": 0.016409091097268784,\n \"acc_norm\": 0.698595146871009,\n\ \ \"acc_norm_stderr\": 0.016409091097268784\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.5433526011560693,\n \"acc_stderr\": 0.026817718130348916,\n\ \ \"acc_norm\": 0.5433526011560693,\n \"acc_norm_stderr\": 0.026817718130348916\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.41899441340782123,\n\ \ \"acc_stderr\": 0.016501579306861674,\n \"acc_norm\": 0.41899441340782123,\n\ \ \"acc_norm_stderr\": 
0.016501579306861674\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.5849673202614379,\n \"acc_stderr\": 0.028213504177824093,\n\ \ \"acc_norm\": 0.5849673202614379,\n \"acc_norm_stderr\": 0.028213504177824093\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5916398713826366,\n\ \ \"acc_stderr\": 0.02791705074848463,\n \"acc_norm\": 0.5916398713826366,\n\ \ \"acc_norm_stderr\": 0.02791705074848463\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.558641975308642,\n \"acc_stderr\": 0.027628737155668773,\n\ \ \"acc_norm\": 0.558641975308642,\n \"acc_norm_stderr\": 0.027628737155668773\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.3546099290780142,\n \"acc_stderr\": 0.02853865002887863,\n \ \ \"acc_norm\": 0.3546099290780142,\n \"acc_norm_stderr\": 0.02853865002887863\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.38396349413298564,\n\ \ \"acc_stderr\": 0.012421587833134231,\n \"acc_norm\": 0.38396349413298564,\n\ \ \"acc_norm_stderr\": 0.012421587833134231\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.5661764705882353,\n \"acc_stderr\": 0.030105636570016633,\n\ \ \"acc_norm\": 0.5661764705882353,\n \"acc_norm_stderr\": 0.030105636570016633\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.4493464052287582,\n \"acc_stderr\": 0.020123766528027262,\n \ \ \"acc_norm\": 0.4493464052287582,\n \"acc_norm_stderr\": 0.020123766528027262\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5909090909090909,\n\ \ \"acc_stderr\": 0.04709306978661895,\n \"acc_norm\": 0.5909090909090909,\n\ \ \"acc_norm_stderr\": 0.04709306978661895\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.6285714285714286,\n \"acc_stderr\": 0.030932858792789855,\n\ \ \"acc_norm\": 0.6285714285714286,\n \"acc_norm_stderr\": 0.030932858792789855\n\ \ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 
0.7014925373134329,\n\ \ \"acc_stderr\": 0.03235743789355043,\n \"acc_norm\": 0.7014925373134329,\n\ \ \"acc_norm_stderr\": 0.03235743789355043\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \ \ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4036144578313253,\n\ \ \"acc_stderr\": 0.038194861407583984,\n \"acc_norm\": 0.4036144578313253,\n\ \ \"acc_norm_stderr\": 0.038194861407583984\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.7192982456140351,\n \"acc_stderr\": 0.03446296217088427,\n\ \ \"acc_norm\": 0.7192982456140351,\n \"acc_norm_stderr\": 0.03446296217088427\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3157894736842105,\n\ \ \"mc1_stderr\": 0.016272287957916916,\n \"mc2\": 0.46416337242431305,\n\ \ \"mc2_stderr\": 0.015037341919156266\n }\n}\n```" repo_url: https://huggingface.co/Kiddyz/testlm-3 leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_09_05T01_42_44.018659 path: - '**/details_harness|arc:challenge|25_2023-09-05T01:42:44.018659.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-09-05T01:42:44.018659.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_09_05T01_42_44.018659 path: - '**/details_harness|hellaswag|10_2023-09-05T01:42:44.018659.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-09-05T01:42:44.018659.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_09_05T01_42_44.018659 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-09-05T01:42:44.018659.parquet' - 
'**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T01:42:44.018659.parquet' - 
'**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-management|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-09-05T01:42:44.018659.parquet' - 
'**/details_harness|hendrycksTest-philosophy|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-09-05T01:42:44.018659.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T01:42:44.018659.parquet' - 
'**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T01:42:44.018659.parquet' - 
'**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-management|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T01:42:44.018659.parquet' - 
'**/details_harness|hendrycksTest-public_relations|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-09-05T01:42:44.018659.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-09-05T01:42:44.018659.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_09_05T01_42_44.018659 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T01:42:44.018659.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T01:42:44.018659.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_09_05T01_42_44.018659 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-09-05T01:42:44.018659.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-09-05T01:42:44.018659.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_09_05T01_42_44.018659 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-09-05T01:42:44.018659.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-09-05T01:42:44.018659.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_09_05T01_42_44.018659 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T01:42:44.018659.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T01:42:44.018659.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_09_05T01_42_44.018659 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T01:42:44.018659.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T01:42:44.018659.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_09_05T01_42_44.018659 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-09-05T01:42:44.018659.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-09-05T01:42:44.018659.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_09_05T01_42_44.018659 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T01:42:44.018659.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T01:42:44.018659.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_09_05T01_42_44.018659 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T01:42:44.018659.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T01:42:44.018659.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_09_05T01_42_44.018659 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T01:42:44.018659.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T01:42:44.018659.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_09_05T01_42_44.018659 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T01:42:44.018659.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T01:42:44.018659.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_09_05T01_42_44.018659 path: - '**/details_harness|hendrycksTest-college_physics|5_2023-09-05T01:42:44.018659.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-09-05T01:42:44.018659.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_09_05T01_42_44.018659 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-09-05T01:42:44.018659.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-09-05T01:42:44.018659.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_09_05T01_42_44.018659 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T01:42:44.018659.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T01:42:44.018659.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_09_05T01_42_44.018659 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-09-05T01:42:44.018659.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-09-05T01:42:44.018659.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_09_05T01_42_44.018659 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T01:42:44.018659.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T01:42:44.018659.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_09_05T01_42_44.018659 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T01:42:44.018659.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T01:42:44.018659.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_09_05T01_42_44.018659 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T01:42:44.018659.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T01:42:44.018659.parquet' - 
config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_09_05T01_42_44.018659 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-09-05T01:42:44.018659.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-09-05T01:42:44.018659.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_09_05T01_42_44.018659 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T01:42:44.018659.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T01:42:44.018659.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_09_05T01_42_44.018659 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T01:42:44.018659.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T01:42:44.018659.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_09_05T01_42_44.018659 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T01:42:44.018659.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T01:42:44.018659.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_09_05T01_42_44.018659 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T01:42:44.018659.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T01:42:44.018659.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_09_05T01_42_44.018659 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T01:42:44.018659.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T01:42:44.018659.parquet' - config_name: 
harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_09_05T01_42_44.018659 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T01:42:44.018659.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T01:42:44.018659.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_09_05T01_42_44.018659 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T01:42:44.018659.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T01:42:44.018659.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_09_05T01_42_44.018659 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T01:42:44.018659.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T01:42:44.018659.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_09_05T01_42_44.018659 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T01:42:44.018659.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T01:42:44.018659.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_09_05T01_42_44.018659 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T01:42:44.018659.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T01:42:44.018659.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_09_05T01_42_44.018659 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T01:42:44.018659.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T01:42:44.018659.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_09_05T01_42_44.018659 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T01:42:44.018659.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T01:42:44.018659.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_09_05T01_42_44.018659 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T01:42:44.018659.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T01:42:44.018659.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_09_05T01_42_44.018659 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T01:42:44.018659.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T01:42:44.018659.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_09_05T01_42_44.018659 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-09-05T01:42:44.018659.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-09-05T01:42:44.018659.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_09_05T01_42_44.018659 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T01:42:44.018659.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T01:42:44.018659.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_09_05T01_42_44.018659 path: - '**/details_harness|hendrycksTest-international_law|5_2023-09-05T01:42:44.018659.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-09-05T01:42:44.018659.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_09_05T01_42_44.018659 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T01:42:44.018659.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T01:42:44.018659.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_09_05T01_42_44.018659 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T01:42:44.018659.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T01:42:44.018659.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_09_05T01_42_44.018659 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T01:42:44.018659.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T01:42:44.018659.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_09_05T01_42_44.018659 path: - '**/details_harness|hendrycksTest-management|5_2023-09-05T01:42:44.018659.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-09-05T01:42:44.018659.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_09_05T01_42_44.018659 path: - '**/details_harness|hendrycksTest-marketing|5_2023-09-05T01:42:44.018659.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-09-05T01:42:44.018659.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_09_05T01_42_44.018659 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T01:42:44.018659.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T01:42:44.018659.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - 
split: 2023_09_05T01_42_44.018659 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T01:42:44.018659.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T01:42:44.018659.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_09_05T01_42_44.018659 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T01:42:44.018659.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T01:42:44.018659.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_09_05T01_42_44.018659 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T01:42:44.018659.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T01:42:44.018659.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_09_05T01_42_44.018659 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-09-05T01:42:44.018659.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-09-05T01:42:44.018659.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_09_05T01_42_44.018659 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-09-05T01:42:44.018659.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-09-05T01:42:44.018659.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_09_05T01_42_44.018659 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-09-05T01:42:44.018659.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-09-05T01:42:44.018659.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2023_09_05T01_42_44.018659 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T01:42:44.018659.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T01:42:44.018659.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_09_05T01_42_44.018659 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-09-05T01:42:44.018659.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-09-05T01:42:44.018659.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_09_05T01_42_44.018659 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T01:42:44.018659.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T01:42:44.018659.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_09_05T01_42_44.018659 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T01:42:44.018659.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T01:42:44.018659.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_09_05T01_42_44.018659 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-09-05T01:42:44.018659.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-09-05T01:42:44.018659.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_09_05T01_42_44.018659 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-09-05T01:42:44.018659.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-09-05T01:42:44.018659.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_09_05T01_42_44.018659 path: - '**/details_harness|hendrycksTest-sociology|5_2023-09-05T01:42:44.018659.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-09-05T01:42:44.018659.parquet' - 
config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_09_05T01_42_44.018659 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T01:42:44.018659.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T01:42:44.018659.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_09_05T01_42_44.018659 path: - '**/details_harness|hendrycksTest-virology|5_2023-09-05T01:42:44.018659.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-09-05T01:42:44.018659.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_09_05T01_42_44.018659 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-09-05T01:42:44.018659.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-09-05T01:42:44.018659.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_09_05T01_42_44.018659 path: - '**/details_harness|truthfulqa:mc|0_2023-09-05T01:42:44.018659.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-09-05T01:42:44.018659.parquet' - config_name: results data_files: - split: 2023_09_05T01_42_44.018659 path: - results_2023-09-05T01:42:44.018659.parquet - split: latest path: - results_2023-09-05T01:42:44.018659.parquet --- # Dataset Card for Evaluation run of Kiddyz/testlm-3 ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/Kiddyz/testlm-3 - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [Kiddyz/testlm-3](https://huggingface.co/Kiddyz/testlm-3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks. 
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_Kiddyz__testlm-3", "harness_truthfulqa_mc_0", split="train") ``` ## Latest results These are the [latest results from run 2023-09-05T01:42:44.018659](https://huggingface.co/datasets/open-llm-leaderboard/details_Kiddyz__testlm-3/blob/main/results_2023-09-05T01%3A42%3A44.018659.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. 
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.5187353874495172, "acc_stderr": 0.035029744575697866, "acc_norm": 0.5224927626027567, "acc_norm_stderr": 0.03501563654467944, "mc1": 0.3157894736842105, "mc1_stderr": 0.016272287957916916, "mc2": 0.46416337242431305, "mc2_stderr": 0.015037341919156266 }, "harness|arc:challenge|25": { "acc": 0.5025597269624573, "acc_stderr": 0.014611199329843784, "acc_norm": 0.5358361774744027, "acc_norm_stderr": 0.014573813664735718 }, "harness|hellaswag|10": { "acc": 0.5963951404102769, "acc_stderr": 0.004896173035943312, "acc_norm": 0.7848038239394542, "acc_norm_stderr": 0.00410118487096419 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.25, "acc_stderr": 0.04351941398892446, "acc_norm": 0.25, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.45925925925925926, "acc_stderr": 0.04304979692464243, "acc_norm": 0.45925925925925926, "acc_norm_stderr": 0.04304979692464243 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.47368421052631576, "acc_stderr": 0.04063302731486671, "acc_norm": 0.47368421052631576, "acc_norm_stderr": 0.04063302731486671 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.56, "acc_stderr": 0.04988876515698589, "acc_norm": 0.56, "acc_norm_stderr": 0.04988876515698589 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.5584905660377358, "acc_stderr": 0.030561590426731837, "acc_norm": 0.5584905660377358, "acc_norm_stderr": 0.030561590426731837 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.5208333333333334, "acc_stderr": 0.041775789507399935, "acc_norm": 0.5208333333333334, "acc_norm_stderr": 0.041775789507399935 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.43, "acc_stderr": 0.04975698519562428, "acc_norm": 0.43, "acc_norm_stderr": 0.04975698519562428 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.43, "acc_stderr": 0.049756985195624284, "acc_norm": 0.43, 
"acc_norm_stderr": 0.049756985195624284 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.37, "acc_stderr": 0.048523658709391, "acc_norm": 0.37, "acc_norm_stderr": 0.048523658709391 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.45664739884393063, "acc_stderr": 0.03798106566014498, "acc_norm": 0.45664739884393063, "acc_norm_stderr": 0.03798106566014498 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.27450980392156865, "acc_stderr": 0.044405219061793254, "acc_norm": 0.27450980392156865, "acc_norm_stderr": 0.044405219061793254 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.68, "acc_stderr": 0.04688261722621505, "acc_norm": 0.68, "acc_norm_stderr": 0.04688261722621505 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.40425531914893614, "acc_stderr": 0.03208115750788684, "acc_norm": 0.40425531914893614, "acc_norm_stderr": 0.03208115750788684 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.2719298245614035, "acc_stderr": 0.04185774424022056, "acc_norm": 0.2719298245614035, "acc_norm_stderr": 0.04185774424022056 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.47586206896551725, "acc_stderr": 0.04161808503501528, "acc_norm": 0.47586206896551725, "acc_norm_stderr": 0.04161808503501528 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.31746031746031744, "acc_stderr": 0.02397386199899208, "acc_norm": 0.31746031746031744, "acc_norm_stderr": 0.02397386199899208 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.3888888888888889, "acc_stderr": 0.04360314860077459, "acc_norm": 0.3888888888888889, "acc_norm_stderr": 0.04360314860077459 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.28, "acc_stderr": 0.045126085985421276, "acc_norm": 0.28, "acc_norm_stderr": 0.045126085985421276 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.5967741935483871, "acc_stderr": 0.027906150826041146, "acc_norm": 0.5967741935483871, "acc_norm_stderr": 0.027906150826041146 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.35467980295566504, "acc_stderr": 0.0336612448905145, "acc_norm": 0.35467980295566504, "acc_norm_stderr": 0.0336612448905145 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.4, "acc_stderr": 0.04923659639173309, "acc_norm": 0.4, "acc_norm_stderr": 0.04923659639173309 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.6848484848484848, "acc_stderr": 0.0362773057502241, "acc_norm": 0.6848484848484848, "acc_norm_stderr": 0.0362773057502241 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.6414141414141414, "acc_stderr": 0.03416903640391521, "acc_norm": 0.6414141414141414, "acc_norm_stderr": 0.03416903640391521 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.7150259067357513, "acc_stderr": 0.03257714077709662, "acc_norm": 0.7150259067357513, "acc_norm_stderr": 0.03257714077709662 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.49743589743589745, "acc_stderr": 0.025350672979412202, "acc_norm": 0.49743589743589745, "acc_norm_stderr": 0.025350672979412202 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.28888888888888886, "acc_stderr": 0.027634907264178544, "acc_norm": 0.28888888888888886, "acc_norm_stderr": 0.027634907264178544 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.46638655462184875, "acc_stderr": 0.03240501447690071, "acc_norm": 0.46638655462184875, "acc_norm_stderr": 0.03240501447690071 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3841059602649007, "acc_stderr": 0.03971301814719197, "acc_norm": 0.3841059602649007, "acc_norm_stderr": 0.03971301814719197 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.7009174311926606, "acc_stderr": 0.019630417285415182, "acc_norm": 0.7009174311926606, "acc_norm_stderr": 0.019630417285415182 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.4722222222222222, "acc_stderr": 
0.0340470532865388, "acc_norm": 0.4722222222222222, "acc_norm_stderr": 0.0340470532865388 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7058823529411765, "acc_stderr": 0.03198001660115072, "acc_norm": 0.7058823529411765, "acc_norm_stderr": 0.03198001660115072 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7383966244725738, "acc_stderr": 0.028609516716994934, "acc_norm": 0.7383966244725738, "acc_norm_stderr": 0.028609516716994934 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.600896860986547, "acc_stderr": 0.03286745312567961, "acc_norm": 0.600896860986547, "acc_norm_stderr": 0.03286745312567961 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.6106870229007634, "acc_stderr": 0.04276486542814591, "acc_norm": 0.6106870229007634, "acc_norm_stderr": 0.04276486542814591 }, "harness|hendrycksTest-international_law|5": { "acc": 0.628099173553719, "acc_stderr": 0.04412015806624505, "acc_norm": 0.628099173553719, "acc_norm_stderr": 0.04412015806624505 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.6296296296296297, "acc_stderr": 0.04668408033024931, "acc_norm": 0.6296296296296297, "acc_norm_stderr": 0.04668408033024931 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.5828220858895705, "acc_stderr": 0.038741028598180814, "acc_norm": 0.5828220858895705, "acc_norm_stderr": 0.038741028598180814 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.3392857142857143, "acc_stderr": 0.04493949068613539, "acc_norm": 0.3392857142857143, "acc_norm_stderr": 0.04493949068613539 }, "harness|hendrycksTest-management|5": { "acc": 0.6699029126213593, "acc_stderr": 0.0465614711001235, "acc_norm": 0.6699029126213593, "acc_norm_stderr": 0.0465614711001235 }, "harness|hendrycksTest-marketing|5": { "acc": 0.7521367521367521, "acc_stderr": 0.0282863240755644, "acc_norm": 0.7521367521367521, "acc_norm_stderr": 0.0282863240755644 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.56, "acc_stderr": 0.04988876515698589, 
"acc_norm": 0.56, "acc_norm_stderr": 0.04988876515698589 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.698595146871009, "acc_stderr": 0.016409091097268784, "acc_norm": 0.698595146871009, "acc_norm_stderr": 0.016409091097268784 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.5433526011560693, "acc_stderr": 0.026817718130348916, "acc_norm": 0.5433526011560693, "acc_norm_stderr": 0.026817718130348916 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.41899441340782123, "acc_stderr": 0.016501579306861674, "acc_norm": 0.41899441340782123, "acc_norm_stderr": 0.016501579306861674 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.5849673202614379, "acc_stderr": 0.028213504177824093, "acc_norm": 0.5849673202614379, "acc_norm_stderr": 0.028213504177824093 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.5916398713826366, "acc_stderr": 0.02791705074848463, "acc_norm": 0.5916398713826366, "acc_norm_stderr": 0.02791705074848463 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.558641975308642, "acc_stderr": 0.027628737155668773, "acc_norm": 0.558641975308642, "acc_norm_stderr": 0.027628737155668773 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.3546099290780142, "acc_stderr": 0.02853865002887863, "acc_norm": 0.3546099290780142, "acc_norm_stderr": 0.02853865002887863 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.38396349413298564, "acc_stderr": 0.012421587833134231, "acc_norm": 0.38396349413298564, "acc_norm_stderr": 0.012421587833134231 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.5661764705882353, "acc_stderr": 0.030105636570016633, "acc_norm": 0.5661764705882353, "acc_norm_stderr": 0.030105636570016633 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.4493464052287582, "acc_stderr": 0.020123766528027262, "acc_norm": 0.4493464052287582, "acc_norm_stderr": 0.020123766528027262 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.5909090909090909, "acc_stderr": 0.04709306978661895, 
"acc_norm": 0.5909090909090909, "acc_norm_stderr": 0.04709306978661895 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.6285714285714286, "acc_stderr": 0.030932858792789855, "acc_norm": 0.6285714285714286, "acc_norm_stderr": 0.030932858792789855 }, "harness|hendrycksTest-sociology|5": { "acc": 0.7014925373134329, "acc_stderr": 0.03235743789355043, "acc_norm": 0.7014925373134329, "acc_norm_stderr": 0.03235743789355043 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.69, "acc_stderr": 0.04648231987117316, "acc_norm": 0.69, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-virology|5": { "acc": 0.4036144578313253, "acc_stderr": 0.038194861407583984, "acc_norm": 0.4036144578313253, "acc_norm_stderr": 0.038194861407583984 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7192982456140351, "acc_stderr": 0.03446296217088427, "acc_norm": 0.7192982456140351, "acc_norm_stderr": 0.03446296217088427 }, "harness|truthfulqa:mc|0": { "mc1": 0.3157894736842105, "mc1_stderr": 0.016272287957916916, "mc2": 0.46416337242431305, "mc2_stderr": 0.015037341919156266 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? 
[More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
faisalq/AraPoems
2023-09-05T02:48:41.000Z
[ "license:afl-3.0", "region:us" ]
faisalq
null
null
null
0
0
--- license: afl-3.0 ---
krishanusinha20/Krishanu_Dataset
2023-09-05T01:47:27.000Z
[ "region:us" ]
krishanusinha20
null
null
null
0
0
--- dataset_info: features: - name: product dtype: string - name: description dtype: string - name: marketing_email dtype: string splits: - name: train num_bytes: 20991 num_examples: 10 download_size: 30027 dataset_size: 20991 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "Krishanu_Dataset" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
bsmock/FinTabNet.c
2023-09-07T04:50:07.000Z
[ "license:cdla-permissive-2.0", "table structure recognition", "table extraction", "arxiv:2303.00716", "region:us" ]
bsmock
null
null
null
0
0
--- license: cdla-permissive-2.0 tags: - table structure recognition - table extraction --- # FinTabNet.c The FinTabNet.c dataset was released in 2023. You can think of FinTabNet.c as a fork (a modified version, in this case by different authors) of the original FinTabNet dataset. FinTabNet.c contains: - automated corrections of FinTabNet (such as canonicalization) to correct oversegmentation and to make the dataset more consistent with other TSR datasets, like PubTables-1M - fewer samples than FinTabNet, where samples were removed whose annotations could not be either automatically processed, corrected, or verified For more details about this version (2023) of the dataset and the adjustments made to the original dataset, please see ["Aligning benchmark datasets for table structure recognition"](https://arxiv.org/abs/2303.00716). For the code used to create this dataset, see [https://github.com/microsoft/table-transformer](https://github.com/microsoft/table-transformer). ## Citing If you use this dataset in your published work, please cite: ``` @inproceedings{smock2023aligning, title={Aligning benchmark datasets for table structure recognition}, author={Smock, Brandon and Pesala, Rohith and Abraham, Robin}, booktitle={International Conference on Document Analysis and Recognition}, pages={371--386}, year={2023}, organization={Springer} } ``` ## About the original FinTabNet dataset Please see: [https://developer.ibm.com/data/fintabnet/](https://developer.ibm.com/data/fintabnet/) (link last checked September 2023). ### Original license According to the dataset website, the license of the original FinTabNet dataset is [CDLA-Permissive](https://cdla.dev/permissive-1-0/).
BBINS/RVC_test_01
2023-09-05T02:20:12.000Z
[ "region:us" ]
BBINS
null
null
null
0
0
Entry not found
cun-bjy/mpi3d_real
2023-09-05T03:49:03.000Z
[ "task_categories:feature-extraction", "size_categories:100M<n<1B", "language:ar", "license:bsd", "code", "region:us" ]
cun-bjy
null
null
null
1
0
--- license: bsd task_categories: - feature-extraction language: - ar tags: - code pretty_name: mpi3d_real size_categories: - 100M<n<1B --- # mpi3d_real This repository is an _unofficial_ backup service for the `MPI3D-Real` dataset, which was introduced in the paper [**On the Transfer of Inductive Bias from Simulation to the Real World: a New Disentanglement Dataset**](https://proceedings.neurips.cc/paper/2019/hash/d97d404b6119214e4a7018391195240a-Abstract.html). The dataset contains images of real objects with varying factors of variation, such as shape, color, texture, and pose. The dataset is useful for studying the generalization and disentanglement abilities of representation learning models. The backup service allows users to download the dataset from a mirror site in case the original source is unavailable. For more detailed information on the dataset, please check the [original repository](https://github.com/rr-learning/disentanglement_dataset). ## Reference [1] Gondal, Muhammad Waleed, et al. "On the transfer of inductive bias from simulation to the real world: a new disentanglement dataset." Advances in Neural Information Processing Systems 32 (2019).
open-llm-leaderboard/details_Undi95__Nous-Hermes-13B-Code
2023-09-05T02:43:24.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
null
0
0
--- pretty_name: Evaluation run of Undi95/Nous-Hermes-13B-Code dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [Undi95/Nous-Hermes-13B-Code](https://huggingface.co/Undi95/Nous-Hermes-13B-Code)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 61 configuration, each one coresponding to one of the\ \ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" store all the aggregated results of the\ \ run (and is used to compute and display the agregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Undi95__Nous-Hermes-13B-Code\"\ ,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\ \nThese are the [latest results from run 2023-09-05T02:42:01.860222](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__Nous-Hermes-13B-Code/blob/main/results_2023-09-05T02%3A42%3A01.860222.json)(note\ \ that their might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5531820683673332,\n\ \ \"acc_stderr\": 0.03477068808309721,\n \"acc_norm\": 0.5571005157301979,\n\ \ \"acc_norm_stderr\": 0.034749271399370084,\n \"mc1\": 0.35006119951040393,\n\ \ \"mc1_stderr\": 0.016697949420151032,\n \"mc2\": 0.505550629963065,\n\ \ \"mc2_stderr\": 0.01590974900800537\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.5793515358361775,\n \"acc_stderr\": 0.014426211252508394,\n\ \ \"acc_norm\": 0.6117747440273038,\n \"acc_norm_stderr\": 0.014241614207414046\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6333399721171081,\n\ \ \"acc_stderr\": 0.004809077205343495,\n \"acc_norm\": 0.832105158334993,\n\ \ \"acc_norm_stderr\": 0.0037300899105375787\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \ \ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5037037037037037,\n\ \ \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.5037037037037037,\n\ \ \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.5394736842105263,\n \"acc_stderr\": 0.04056242252249033,\n\ \ \"acc_norm\": 0.5394736842105263,\n \"acc_norm_stderr\": 0.04056242252249033\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\ \ \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \ \ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.5773584905660377,\n \"acc_stderr\": 0.03040233144576954,\n\ \ \"acc_norm\": 0.5773584905660377,\n \"acc_norm_stderr\": 0.03040233144576954\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6388888888888888,\n\ \ \"acc_stderr\": 0.04016660030451233,\n \"acc_norm\": 0.6388888888888888,\n\ \ \"acc_norm_stderr\": 0.04016660030451233\n 
},\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \ \ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n\ \ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \ \ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5144508670520231,\n\ \ \"acc_stderr\": 0.03810871630454764,\n \"acc_norm\": 0.5144508670520231,\n\ \ \"acc_norm_stderr\": 0.03810871630454764\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.04440521906179328,\n\ \ \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.04440521906179328\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n\ \ \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.46382978723404256,\n \"acc_stderr\": 0.032600385118357715,\n\ \ \"acc_norm\": 0.46382978723404256,\n \"acc_norm_stderr\": 0.032600385118357715\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3157894736842105,\n\ \ \"acc_stderr\": 0.043727482902780064,\n \"acc_norm\": 0.3157894736842105,\n\ \ \"acc_norm_stderr\": 0.043727482902780064\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.47586206896551725,\n \"acc_stderr\": 0.041618085035015295,\n\ \ \"acc_norm\": 0.47586206896551725,\n \"acc_norm_stderr\": 0.041618085035015295\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.335978835978836,\n \"acc_stderr\": 0.024326310529149138,\n \"\ acc_norm\": 0.335978835978836,\n 
\"acc_norm_stderr\": 0.024326310529149138\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.38095238095238093,\n\ \ \"acc_stderr\": 0.043435254289490965,\n \"acc_norm\": 0.38095238095238093,\n\ \ \"acc_norm_stderr\": 0.043435254289490965\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \ \ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\ : 0.6290322580645161,\n \"acc_stderr\": 0.027480541887953586,\n \"\ acc_norm\": 0.6290322580645161,\n \"acc_norm_stderr\": 0.027480541887953586\n\ \ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\ : 0.43349753694581283,\n \"acc_stderr\": 0.034867317274198714,\n \"\ acc_norm\": 0.43349753694581283,\n \"acc_norm_stderr\": 0.034867317274198714\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\ : 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.6242424242424243,\n \"acc_stderr\": 0.03781887353205982,\n\ \ \"acc_norm\": 0.6242424242424243,\n \"acc_norm_stderr\": 0.03781887353205982\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.6868686868686869,\n \"acc_stderr\": 0.033042050878136525,\n \"\ acc_norm\": 0.6868686868686869,\n \"acc_norm_stderr\": 0.033042050878136525\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.7616580310880829,\n \"acc_stderr\": 0.03074890536390989,\n\ \ \"acc_norm\": 0.7616580310880829,\n \"acc_norm_stderr\": 0.03074890536390989\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.4846153846153846,\n \"acc_stderr\": 0.02533900301010651,\n \ \ \"acc_norm\": 0.4846153846153846,\n \"acc_norm_stderr\": 0.02533900301010651\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028604,\n \ \ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028604\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.5588235294117647,\n \"acc_stderr\": 0.032252942323996406,\n\ \ \"acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.032252942323996406\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.4105960264900662,\n \"acc_stderr\": 0.04016689594849928,\n \"\ acc_norm\": 0.4105960264900662,\n \"acc_norm_stderr\": 0.04016689594849928\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.7302752293577982,\n \"acc_stderr\": 0.01902848671111544,\n \"\ acc_norm\": 0.7302752293577982,\n \"acc_norm_stderr\": 0.01902848671111544\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.47685185185185186,\n \"acc_stderr\": 0.03406315360711507,\n \"\ acc_norm\": 0.47685185185185186,\n \"acc_norm_stderr\": 0.03406315360711507\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.7647058823529411,\n \"acc_stderr\": 0.02977177522814563,\n \"\ acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.02977177522814563\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.7215189873417721,\n \"acc_stderr\": 0.029178682304842538,\n \ \ \"acc_norm\": 0.7215189873417721,\n \"acc_norm_stderr\": 0.029178682304842538\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6098654708520179,\n\ \ \"acc_stderr\": 0.03273766725459157,\n \"acc_norm\": 0.6098654708520179,\n\ \ \"acc_norm_stderr\": 0.03273766725459157\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.5801526717557252,\n \"acc_stderr\": 0.04328577215262971,\n\ \ \"acc_norm\": 0.5801526717557252,\n \"acc_norm_stderr\": 0.04328577215262971\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.7272727272727273,\n \"acc_stderr\": 0.04065578140908706,\n \"\ acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04065578140908706\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7037037037037037,\n\ \ \"acc_stderr\": 0.044143436668549335,\n \"acc_norm\": 0.7037037037037037,\n\ \ \"acc_norm_stderr\": 0.044143436668549335\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.6625766871165644,\n \"acc_stderr\": 0.03714908409935574,\n\ \ \"acc_norm\": 0.6625766871165644,\n \"acc_norm_stderr\": 0.03714908409935574\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3482142857142857,\n\ \ \"acc_stderr\": 0.04521829902833586,\n \"acc_norm\": 0.3482142857142857,\n\ \ \"acc_norm_stderr\": 0.04521829902833586\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.7184466019417476,\n \"acc_stderr\": 0.04453254836326468,\n\ \ \"acc_norm\": 0.7184466019417476,\n \"acc_norm_stderr\": 0.04453254836326468\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7606837606837606,\n\ \ \"acc_stderr\": 0.027951826808924333,\n \"acc_norm\": 0.7606837606837606,\n\ \ \"acc_norm_stderr\": 0.027951826808924333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \ \ \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7675606641123882,\n\ \ \"acc_stderr\": 0.015104550008905718,\n \"acc_norm\": 0.7675606641123882,\n\ \ \"acc_norm_stderr\": 0.015104550008905718\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.5867052023121387,\n \"acc_stderr\": 0.02651126136940924,\n\ \ \"acc_norm\": 0.5867052023121387,\n \"acc_norm_stderr\": 0.02651126136940924\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.376536312849162,\n\ \ \"acc_stderr\": 0.0162046723851066,\n \"acc_norm\": 
0.376536312849162,\n\ \ \"acc_norm_stderr\": 0.0162046723851066\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.5784313725490197,\n \"acc_stderr\": 0.028275490156791455,\n\ \ \"acc_norm\": 0.5784313725490197,\n \"acc_norm_stderr\": 0.028275490156791455\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.617363344051447,\n\ \ \"acc_stderr\": 0.02760468902858198,\n \"acc_norm\": 0.617363344051447,\n\ \ \"acc_norm_stderr\": 0.02760468902858198\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.6327160493827161,\n \"acc_stderr\": 0.026822801759507894,\n\ \ \"acc_norm\": 0.6327160493827161,\n \"acc_norm_stderr\": 0.026822801759507894\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.40425531914893614,\n \"acc_stderr\": 0.029275532159704725,\n \ \ \"acc_norm\": 0.40425531914893614,\n \"acc_norm_stderr\": 0.029275532159704725\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4152542372881356,\n\ \ \"acc_stderr\": 0.01258547179340066,\n \"acc_norm\": 0.4152542372881356,\n\ \ \"acc_norm_stderr\": 0.01258547179340066\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.5330882352941176,\n \"acc_stderr\": 0.030306257722468317,\n\ \ \"acc_norm\": 0.5330882352941176,\n \"acc_norm_stderr\": 0.030306257722468317\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.5441176470588235,\n \"acc_stderr\": 0.020148939420415745,\n \ \ \"acc_norm\": 0.5441176470588235,\n \"acc_norm_stderr\": 0.020148939420415745\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5818181818181818,\n\ \ \"acc_stderr\": 0.04724577405731572,\n \"acc_norm\": 0.5818181818181818,\n\ \ \"acc_norm_stderr\": 0.04724577405731572\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.5836734693877551,\n \"acc_stderr\": 0.031557828165561644,\n\ \ \"acc_norm\": 0.5836734693877551,\n \"acc_norm_stderr\": 0.031557828165561644\n\ \ },\n 
\"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7512437810945274,\n\ \ \"acc_stderr\": 0.030567675938916718,\n \"acc_norm\": 0.7512437810945274,\n\ \ \"acc_norm_stderr\": 0.030567675938916718\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036846,\n \ \ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n\ \ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.43373493975903615,\n\ \ \"acc_stderr\": 0.03858158940685516,\n \"acc_norm\": 0.43373493975903615,\n\ \ \"acc_norm_stderr\": 0.03858158940685516\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.7719298245614035,\n \"acc_stderr\": 0.03218093795602357,\n\ \ \"acc_norm\": 0.7719298245614035,\n \"acc_norm_stderr\": 0.03218093795602357\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.35006119951040393,\n\ \ \"mc1_stderr\": 0.016697949420151032,\n \"mc2\": 0.505550629963065,\n\ \ \"mc2_stderr\": 0.01590974900800537\n }\n}\n```" repo_url: https://huggingface.co/Undi95/Nous-Hermes-13B-Code leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_09_05T02_42_01.860222 path: - '**/details_harness|arc:challenge|25_2023-09-05T02:42:01.860222.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-09-05T02:42:01.860222.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_09_05T02_42_01.860222 path: - '**/details_harness|hellaswag|10_2023-09-05T02:42:01.860222.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-09-05T02:42:01.860222.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_09_05T02_42_01.860222 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-09-05T02:42:01.860222.parquet' - 
'**/details_harness|hendrycksTest-astronomy|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T02:42:01.860222.parquet' - 
'**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-management|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T02:42:01.860222.parquet' - 
'**/details_harness|hendrycksTest-nutrition|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-09-05T02:42:01.860222.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T02:42:01.860222.parquet' - 
'**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T02:42:01.860222.parquet' - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-management|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T02:42:01.860222.parquet' - 
'**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-09-05T02:42:01.860222.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-09-05T02:42:01.860222.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_09_05T02_42_01.860222 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T02:42:01.860222.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T02:42:01.860222.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_09_05T02_42_01.860222 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-09-05T02:42:01.860222.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-09-05T02:42:01.860222.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_09_05T02_42_01.860222 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-09-05T02:42:01.860222.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-09-05T02:42:01.860222.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_09_05T02_42_01.860222 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T02:42:01.860222.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T02:42:01.860222.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_09_05T02_42_01.860222 path: - 
'**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T02:42:01.860222.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T02:42:01.860222.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_09_05T02_42_01.860222 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-09-05T02:42:01.860222.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-09-05T02:42:01.860222.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_09_05T02_42_01.860222 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T02:42:01.860222.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T02:42:01.860222.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_09_05T02_42_01.860222 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T02:42:01.860222.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T02:42:01.860222.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_09_05T02_42_01.860222 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T02:42:01.860222.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T02:42:01.860222.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_09_05T02_42_01.860222 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T02:42:01.860222.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T02:42:01.860222.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_09_05T02_42_01.860222 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-09-05T02:42:01.860222.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-09-05T02:42:01.860222.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_09_05T02_42_01.860222 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-09-05T02:42:01.860222.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-09-05T02:42:01.860222.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_09_05T02_42_01.860222 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T02:42:01.860222.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T02:42:01.860222.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_09_05T02_42_01.860222 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-09-05T02:42:01.860222.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-09-05T02:42:01.860222.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_09_05T02_42_01.860222 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T02:42:01.860222.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T02:42:01.860222.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_09_05T02_42_01.860222 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T02:42:01.860222.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T02:42:01.860222.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_09_05T02_42_01.860222 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T02:42:01.860222.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T02:42:01.860222.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_09_05T02_42_01.860222 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-09-05T02:42:01.860222.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-09-05T02:42:01.860222.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_09_05T02_42_01.860222 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T02:42:01.860222.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T02:42:01.860222.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_09_05T02_42_01.860222 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T02:42:01.860222.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T02:42:01.860222.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_09_05T02_42_01.860222 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T02:42:01.860222.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T02:42:01.860222.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_09_05T02_42_01.860222 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T02:42:01.860222.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T02:42:01.860222.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_09_05T02_42_01.860222 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T02:42:01.860222.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T02:42:01.860222.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_09_05T02_42_01.860222 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T02:42:01.860222.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T02:42:01.860222.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_09_05T02_42_01.860222 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T02:42:01.860222.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T02:42:01.860222.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_09_05T02_42_01.860222 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T02:42:01.860222.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T02:42:01.860222.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_09_05T02_42_01.860222 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T02:42:01.860222.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T02:42:01.860222.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_09_05T02_42_01.860222 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T02:42:01.860222.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T02:42:01.860222.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_09_05T02_42_01.860222 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T02:42:01.860222.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T02:42:01.860222.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_09_05T02_42_01.860222 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T02:42:01.860222.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T02:42:01.860222.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_09_05T02_42_01.860222 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T02:42:01.860222.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T02:42:01.860222.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_09_05T02_42_01.860222 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T02:42:01.860222.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T02:42:01.860222.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_09_05T02_42_01.860222 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-09-05T02:42:01.860222.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-09-05T02:42:01.860222.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_09_05T02_42_01.860222 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T02:42:01.860222.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T02:42:01.860222.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_09_05T02_42_01.860222 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-09-05T02:42:01.860222.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-09-05T02:42:01.860222.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_09_05T02_42_01.860222 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T02:42:01.860222.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T02:42:01.860222.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_09_05T02_42_01.860222 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T02:42:01.860222.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T02:42:01.860222.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_09_05T02_42_01.860222 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T02:42:01.860222.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T02:42:01.860222.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_09_05T02_42_01.860222 path: - '**/details_harness|hendrycksTest-management|5_2023-09-05T02:42:01.860222.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-09-05T02:42:01.860222.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_09_05T02_42_01.860222 path: - '**/details_harness|hendrycksTest-marketing|5_2023-09-05T02:42:01.860222.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-09-05T02:42:01.860222.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_09_05T02_42_01.860222 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T02:42:01.860222.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T02:42:01.860222.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_09_05T02_42_01.860222 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T02:42:01.860222.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T02:42:01.860222.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_09_05T02_42_01.860222 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T02:42:01.860222.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T02:42:01.860222.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_09_05T02_42_01.860222 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T02:42:01.860222.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T02:42:01.860222.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_09_05T02_42_01.860222 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-09-05T02:42:01.860222.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-09-05T02:42:01.860222.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_09_05T02_42_01.860222 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-09-05T02:42:01.860222.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-09-05T02:42:01.860222.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_09_05T02_42_01.860222 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-09-05T02:42:01.860222.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-09-05T02:42:01.860222.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_09_05T02_42_01.860222 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T02:42:01.860222.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T02:42:01.860222.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_09_05T02_42_01.860222 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-09-05T02:42:01.860222.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-09-05T02:42:01.860222.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_09_05T02_42_01.860222 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T02:42:01.860222.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T02:42:01.860222.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_09_05T02_42_01.860222 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T02:42:01.860222.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T02:42:01.860222.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_09_05T02_42_01.860222 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-09-05T02:42:01.860222.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-09-05T02:42:01.860222.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_09_05T02_42_01.860222 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-09-05T02:42:01.860222.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-09-05T02:42:01.860222.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_09_05T02_42_01.860222 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-09-05T02:42:01.860222.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-09-05T02:42:01.860222.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_09_05T02_42_01.860222 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T02:42:01.860222.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T02:42:01.860222.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_09_05T02_42_01.860222 path: - '**/details_harness|hendrycksTest-virology|5_2023-09-05T02:42:01.860222.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-09-05T02:42:01.860222.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_09_05T02_42_01.860222 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-09-05T02:42:01.860222.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-09-05T02:42:01.860222.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_09_05T02_42_01.860222 path: - '**/details_harness|truthfulqa:mc|0_2023-09-05T02:42:01.860222.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-09-05T02:42:01.860222.parquet' - config_name: results data_files: - split: 2023_09_05T02_42_01.860222 path: - results_2023-09-05T02:42:01.860222.parquet - split: latest path: - results_2023-09-05T02:42:01.860222.parquet --- # Dataset Card for Evaluation run of Undi95/Nous-Hermes-13B-Code ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/Undi95/Nous-Hermes-13B-Code - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model 
[Undi95/Nous-Hermes-13B-Code](https://huggingface.co/Undi95/Nous-Hermes-13B-Code) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_Undi95__Nous-Hermes-13B-Code", "harness_truthfulqa_mc_0", split="train") ``` ## Latest results These are the [latest results from run 2023-09-05T02:42:01.860222](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__Nous-Hermes-13B-Code/blob/main/results_2023-09-05T02%3A42%3A01.860222.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. 
You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.5531820683673332, "acc_stderr": 0.03477068808309721, "acc_norm": 0.5571005157301979, "acc_norm_stderr": 0.034749271399370084, "mc1": 0.35006119951040393, "mc1_stderr": 0.016697949420151032, "mc2": 0.505550629963065, "mc2_stderr": 0.01590974900800537 }, "harness|arc:challenge|25": { "acc": 0.5793515358361775, "acc_stderr": 0.014426211252508394, "acc_norm": 0.6117747440273038, "acc_norm_stderr": 0.014241614207414046 }, "harness|hellaswag|10": { "acc": 0.6333399721171081, "acc_stderr": 0.004809077205343495, "acc_norm": 0.832105158334993, "acc_norm_stderr": 0.0037300899105375787 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.32, "acc_stderr": 0.046882617226215034, "acc_norm": 0.32, "acc_norm_stderr": 0.046882617226215034 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.5037037037037037, "acc_stderr": 0.04319223625811331, "acc_norm": 0.5037037037037037, "acc_norm_stderr": 0.04319223625811331 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.5394736842105263, "acc_stderr": 0.04056242252249033, "acc_norm": 0.5394736842105263, "acc_norm_stderr": 0.04056242252249033 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.59, "acc_stderr": 0.049431107042371025, "acc_norm": 0.59, "acc_norm_stderr": 0.049431107042371025 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.5773584905660377, "acc_stderr": 0.03040233144576954, "acc_norm": 0.5773584905660377, "acc_norm_stderr": 0.03040233144576954 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.6388888888888888, "acc_stderr": 0.04016660030451233, "acc_norm": 0.6388888888888888, "acc_norm_stderr": 0.04016660030451233 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.46, "acc_stderr": 0.05009082659620332, "acc_norm": 0.46, "acc_norm_stderr": 0.05009082659620332 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.47, "acc_stderr": 0.05016135580465919, "acc_norm": 0.47, 
"acc_norm_stderr": 0.05016135580465919 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.5144508670520231, "acc_stderr": 0.03810871630454764, "acc_norm": 0.5144508670520231, "acc_norm_stderr": 0.03810871630454764 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.27450980392156865, "acc_stderr": 0.04440521906179328, "acc_norm": 0.27450980392156865, "acc_norm_stderr": 0.04440521906179328 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.65, "acc_stderr": 0.0479372485441102, "acc_norm": 0.65, "acc_norm_stderr": 0.0479372485441102 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.46382978723404256, "acc_stderr": 0.032600385118357715, "acc_norm": 0.46382978723404256, "acc_norm_stderr": 0.032600385118357715 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.3157894736842105, "acc_stderr": 0.043727482902780064, "acc_norm": 0.3157894736842105, "acc_norm_stderr": 0.043727482902780064 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.47586206896551725, "acc_stderr": 0.041618085035015295, "acc_norm": 0.47586206896551725, "acc_norm_stderr": 0.041618085035015295 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.335978835978836, "acc_stderr": 0.024326310529149138, "acc_norm": 0.335978835978836, "acc_norm_stderr": 0.024326310529149138 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.38095238095238093, "acc_stderr": 0.043435254289490965, "acc_norm": 0.38095238095238093, "acc_norm_stderr": 0.043435254289490965 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.36, "acc_stderr": 0.048241815132442176, "acc_norm": 0.36, "acc_norm_stderr": 0.048241815132442176 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.6290322580645161, "acc_stderr": 0.027480541887953586, "acc_norm": 0.6290322580645161, "acc_norm_stderr": 0.027480541887953586 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.43349753694581283, "acc_stderr": 0.034867317274198714, "acc_norm": 0.43349753694581283, "acc_norm_stderr": 0.034867317274198714 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.58, "acc_stderr": 0.049604496374885836, "acc_norm": 0.58, "acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.6242424242424243, "acc_stderr": 0.03781887353205982, "acc_norm": 0.6242424242424243, "acc_norm_stderr": 0.03781887353205982 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.6868686868686869, "acc_stderr": 0.033042050878136525, "acc_norm": 0.6868686868686869, "acc_norm_stderr": 0.033042050878136525 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.7616580310880829, "acc_stderr": 0.03074890536390989, "acc_norm": 0.7616580310880829, "acc_norm_stderr": 0.03074890536390989 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.4846153846153846, "acc_stderr": 0.02533900301010651, "acc_norm": 0.4846153846153846, "acc_norm_stderr": 0.02533900301010651 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.32222222222222224, "acc_stderr": 0.028493465091028604, "acc_norm": 0.32222222222222224, "acc_norm_stderr": 0.028493465091028604 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.5588235294117647, "acc_stderr": 0.032252942323996406, "acc_norm": 0.5588235294117647, "acc_norm_stderr": 0.032252942323996406 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.4105960264900662, "acc_stderr": 0.04016689594849928, "acc_norm": 0.4105960264900662, "acc_norm_stderr": 0.04016689594849928 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.7302752293577982, "acc_stderr": 0.01902848671111544, "acc_norm": 0.7302752293577982, "acc_norm_stderr": 0.01902848671111544 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.47685185185185186, "acc_stderr": 
0.03406315360711507, "acc_norm": 0.47685185185185186, "acc_norm_stderr": 0.03406315360711507 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7647058823529411, "acc_stderr": 0.02977177522814563, "acc_norm": 0.7647058823529411, "acc_norm_stderr": 0.02977177522814563 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7215189873417721, "acc_stderr": 0.029178682304842538, "acc_norm": 0.7215189873417721, "acc_norm_stderr": 0.029178682304842538 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6098654708520179, "acc_stderr": 0.03273766725459157, "acc_norm": 0.6098654708520179, "acc_norm_stderr": 0.03273766725459157 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.5801526717557252, "acc_stderr": 0.04328577215262971, "acc_norm": 0.5801526717557252, "acc_norm_stderr": 0.04328577215262971 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7272727272727273, "acc_stderr": 0.04065578140908706, "acc_norm": 0.7272727272727273, "acc_norm_stderr": 0.04065578140908706 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7037037037037037, "acc_stderr": 0.044143436668549335, "acc_norm": 0.7037037037037037, "acc_norm_stderr": 0.044143436668549335 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.6625766871165644, "acc_stderr": 0.03714908409935574, "acc_norm": 0.6625766871165644, "acc_norm_stderr": 0.03714908409935574 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.3482142857142857, "acc_stderr": 0.04521829902833586, "acc_norm": 0.3482142857142857, "acc_norm_stderr": 0.04521829902833586 }, "harness|hendrycksTest-management|5": { "acc": 0.7184466019417476, "acc_stderr": 0.04453254836326468, "acc_norm": 0.7184466019417476, "acc_norm_stderr": 0.04453254836326468 }, "harness|hendrycksTest-marketing|5": { "acc": 0.7606837606837606, "acc_stderr": 0.027951826808924333, "acc_norm": 0.7606837606837606, "acc_norm_stderr": 0.027951826808924333 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.57, "acc_stderr": 
0.049756985195624284, "acc_norm": 0.57, "acc_norm_stderr": 0.049756985195624284 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7675606641123882, "acc_stderr": 0.015104550008905718, "acc_norm": 0.7675606641123882, "acc_norm_stderr": 0.015104550008905718 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.5867052023121387, "acc_stderr": 0.02651126136940924, "acc_norm": 0.5867052023121387, "acc_norm_stderr": 0.02651126136940924 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.376536312849162, "acc_stderr": 0.0162046723851066, "acc_norm": 0.376536312849162, "acc_norm_stderr": 0.0162046723851066 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.5784313725490197, "acc_stderr": 0.028275490156791455, "acc_norm": 0.5784313725490197, "acc_norm_stderr": 0.028275490156791455 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.617363344051447, "acc_stderr": 0.02760468902858198, "acc_norm": 0.617363344051447, "acc_norm_stderr": 0.02760468902858198 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.6327160493827161, "acc_stderr": 0.026822801759507894, "acc_norm": 0.6327160493827161, "acc_norm_stderr": 0.026822801759507894 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.40425531914893614, "acc_stderr": 0.029275532159704725, "acc_norm": 0.40425531914893614, "acc_norm_stderr": 0.029275532159704725 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4152542372881356, "acc_stderr": 0.01258547179340066, "acc_norm": 0.4152542372881356, "acc_norm_stderr": 0.01258547179340066 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.5330882352941176, "acc_stderr": 0.030306257722468317, "acc_norm": 0.5330882352941176, "acc_norm_stderr": 0.030306257722468317 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.5441176470588235, "acc_stderr": 0.020148939420415745, "acc_norm": 0.5441176470588235, "acc_norm_stderr": 0.020148939420415745 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.5818181818181818, "acc_stderr": 
0.04724577405731572, "acc_norm": 0.5818181818181818, "acc_norm_stderr": 0.04724577405731572 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.5836734693877551, "acc_stderr": 0.031557828165561644, "acc_norm": 0.5836734693877551, "acc_norm_stderr": 0.031557828165561644 }, "harness|hendrycksTest-sociology|5": { "acc": 0.7512437810945274, "acc_stderr": 0.030567675938916718, "acc_norm": 0.7512437810945274, "acc_norm_stderr": 0.030567675938916718 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.8, "acc_stderr": 0.04020151261036846, "acc_norm": 0.8, "acc_norm_stderr": 0.04020151261036846 }, "harness|hendrycksTest-virology|5": { "acc": 0.43373493975903615, "acc_stderr": 0.03858158940685516, "acc_norm": 0.43373493975903615, "acc_norm_stderr": 0.03858158940685516 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7719298245614035, "acc_stderr": 0.03218093795602357, "acc_norm": 0.7719298245614035, "acc_norm_stderr": 0.03218093795602357 }, "harness|truthfulqa:mc|0": { "mc1": 0.35006119951040393, "mc1_stderr": 0.016697949420151032, "mc2": 0.505550629963065, "mc2_stderr": 0.01590974900800537 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? 
[More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
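The per-task metrics in the "Latest results" JSON above can be post-processed with plain Python once loaded. A minimal sketch follows; the dictionary below hand-copies a few of the MMLU sub-task values printed above, so the numbers are illustrative rather than re-fetched from the Hub:

```python
# A small excerpt of the per-task results shown above, keyed by harness task name.
# (Values copied from the "Latest results" JSON; in practice you would load the
# "results" configuration with datasets.load_dataset instead of hard-coding them.)
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.32},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.5037037037037037},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.5394736842105263},
}

# Mean accuracy over the selected sub-tasks.
mean_acc = sum(task["acc"] for task in results.values()) / len(results)
print(round(mean_acc, 4))
```

The same pattern extends to `acc_norm` or any other metric column once the full "results" split is loaded.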
YoungPhlo/openchat-sharegpt_gpt4_standardized
2023-09-05T02:45:48.000Z
[ "region:us" ]
YoungPhlo
null
null
null
0
0
Entry not found
open-llm-leaderboard/details_Undi95__LewdEngine
2023-09-05T02:57:43.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
null
0
0
--- pretty_name: Evaluation run of Undi95/LewdEngine dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [Undi95/LewdEngine](https://huggingface.co/Undi95/LewdEngine) on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 61 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split always points to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Undi95__LewdEngine\"\ ,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\ \nThese are the [latest results from run 2023-09-05T02:56:23.442470](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__LewdEngine/blob/main/results_2023-09-05T02%3A56%3A23.442470.json) (note\ \ that there might be results for other tasks in the repo if successive evals didn't\ \ cover the same tasks. 
You can find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n    \"all\": {\n        \"acc\": 0.5500083344336444,\n\ \        \"acc_stderr\": 0.0345125155387716,\n        \"acc_norm\": 0.5541685269228124,\n\ \        \"acc_norm_stderr\": 0.034490519380335635,\n        \"mc1\": 0.3047735618115055,\n\ \        \"mc1_stderr\": 0.016114124156882455,\n        \"mc2\": 0.43629332146485117,\n\ \        \"mc2_stderr\": 0.014738333697751311\n    },\n    \"harness|arc:challenge|25\"\ : {\n        \"acc\": 0.5571672354948806,\n        \"acc_stderr\": 0.014515573873348902,\n\ \        \"acc_norm\": 0.6049488054607508,\n        \"acc_norm_stderr\": 0.014285898292938165\n\ \    },\n    \"harness|hellaswag|10\": {\n        \"acc\": 0.6331408086038638,\n\ \        \"acc_stderr\": 0.004809626723626823,\n        \"acc_norm\": 0.8308105954989046,\n\ \        \"acc_norm_stderr\": 0.0037415289563158417\n    },\n    \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n        \"acc\": 0.37,\n        \"acc_stderr\": 0.04852365870939099,\n \ \        \"acc_norm\": 0.37,\n        \"acc_norm_stderr\": 0.04852365870939099\n \ \    },\n    \"harness|hendrycksTest-anatomy|5\": {\n        \"acc\": 0.4740740740740741,\n\ \        \"acc_stderr\": 0.04313531696750574,\n        \"acc_norm\": 0.4740740740740741,\n\ \        \"acc_norm_stderr\": 0.04313531696750574\n    },\n    \"harness|hendrycksTest-astronomy|5\"\ : {\n        \"acc\": 0.5197368421052632,\n        \"acc_stderr\": 0.040657710025626036,\n\ \        \"acc_norm\": 0.5197368421052632,\n        \"acc_norm_stderr\": 0.040657710025626036\n\ \    },\n    \"harness|hendrycksTest-business_ethics|5\": {\n        \"acc\": 0.52,\n\ \        \"acc_stderr\": 0.050211673156867795,\n        \"acc_norm\": 0.52,\n \ \        \"acc_norm_stderr\": 0.050211673156867795\n    },\n    \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n        \"acc\": 0.6,\n        \"acc_stderr\": 0.03015113445777629,\n \ \        \"acc_norm\": 0.6,\n        \"acc_norm_stderr\": 0.03015113445777629\n    },\n\ \    \"harness|hendrycksTest-college_biology|5\": {\n        \"acc\": 0.5972222222222222,\n\ \        \"acc_stderr\": 0.04101405519842425,\n        \"acc_norm\": 0.5972222222222222,\n\ \        \"acc_norm_stderr\": 0.04101405519842425\n    },\n 
\"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \ \ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\ \ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.44,\n\ \ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \ \ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \ \ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4682080924855491,\n\ \ \"acc_stderr\": 0.03804749744364764,\n \"acc_norm\": 0.4682080924855491,\n\ \ \"acc_norm_stderr\": 0.03804749744364764\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.043898699568087764,\n\ \ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.043898699568087764\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.67,\n \"acc_stderr\": 0.047258156262526094,\n \"acc_norm\": 0.67,\n\ \ \"acc_norm_stderr\": 0.047258156262526094\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.425531914893617,\n \"acc_stderr\": 0.03232146916224468,\n\ \ \"acc_norm\": 0.425531914893617,\n \"acc_norm_stderr\": 0.03232146916224468\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n\ \ \"acc_stderr\": 0.042270544512322004,\n \"acc_norm\": 0.2807017543859649,\n\ \ \"acc_norm_stderr\": 0.042270544512322004\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.47586206896551725,\n \"acc_stderr\": 0.041618085035015295,\n\ \ \"acc_norm\": 0.47586206896551725,\n \"acc_norm_stderr\": 0.041618085035015295\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.335978835978836,\n \"acc_stderr\": 0.024326310529149138,\n \"\ acc_norm\": 0.335978835978836,\n \"acc_norm_stderr\": 
0.024326310529149138\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3253968253968254,\n\ \ \"acc_stderr\": 0.041905964388711366,\n \"acc_norm\": 0.3253968253968254,\n\ \ \"acc_norm_stderr\": 0.041905964388711366\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \ \ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\ \ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6483870967741936,\n\ \ \"acc_stderr\": 0.02716253782694846,\n \"acc_norm\": 0.6483870967741936,\n\ \ \"acc_norm_stderr\": 0.02716253782694846\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.4630541871921182,\n \"acc_stderr\": 0.035083705204426656,\n\ \ \"acc_norm\": 0.4630541871921182,\n \"acc_norm_stderr\": 0.035083705204426656\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\ : 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.6848484848484848,\n \"acc_stderr\": 0.0362773057502241,\n\ \ \"acc_norm\": 0.6848484848484848,\n \"acc_norm_stderr\": 0.0362773057502241\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.6868686868686869,\n \"acc_stderr\": 0.033042050878136525,\n \"\ acc_norm\": 0.6868686868686869,\n \"acc_norm_stderr\": 0.033042050878136525\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.8082901554404145,\n \"acc_stderr\": 0.02840895362624528,\n\ \ \"acc_norm\": 0.8082901554404145,\n \"acc_norm_stderr\": 0.02840895362624528\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.5076923076923077,\n \"acc_stderr\": 0.025348006031534778,\n\ \ \"acc_norm\": 0.5076923076923077,\n \"acc_norm_stderr\": 0.025348006031534778\n\ \ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n 
\"\ acc\": 0.3111111111111111,\n \"acc_stderr\": 0.02822644674968352,\n \ \ \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.02822644674968352\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.5756302521008403,\n \"acc_stderr\": 0.032104790510157764,\n\ \ \"acc_norm\": 0.5756302521008403,\n \"acc_norm_stderr\": 0.032104790510157764\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"\ acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.7155963302752294,\n \"acc_stderr\": 0.01934203658770259,\n \"\ acc_norm\": 0.7155963302752294,\n \"acc_norm_stderr\": 0.01934203658770259\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.4166666666666667,\n \"acc_stderr\": 0.033622774366080445,\n \"\ acc_norm\": 0.4166666666666667,\n \"acc_norm_stderr\": 0.033622774366080445\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.7549019607843137,\n \"acc_stderr\": 0.030190282453501943,\n \"\ acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.030190282453501943\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.7215189873417721,\n \"acc_stderr\": 0.029178682304842538,\n \ \ \"acc_norm\": 0.7215189873417721,\n \"acc_norm_stderr\": 0.029178682304842538\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n\ \ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n\ \ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.6564885496183206,\n \"acc_stderr\": 0.041649760719448786,\n\ \ \"acc_norm\": 0.6564885496183206,\n \"acc_norm_stderr\": 0.041649760719448786\n\ \ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.71900826446281,\n 
\"acc_stderr\": 0.041032038305145124,\n \"acc_norm\"\ : 0.71900826446281,\n \"acc_norm_stderr\": 0.041032038305145124\n },\n\ \ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7129629629629629,\n\ \ \"acc_stderr\": 0.04373313040914761,\n \"acc_norm\": 0.7129629629629629,\n\ \ \"acc_norm_stderr\": 0.04373313040914761\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.6809815950920245,\n \"acc_stderr\": 0.03661997551073836,\n\ \ \"acc_norm\": 0.6809815950920245,\n \"acc_norm_stderr\": 0.03661997551073836\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n\ \ \"acc_stderr\": 0.04327040932578729,\n \"acc_norm\": 0.29464285714285715,\n\ \ \"acc_norm_stderr\": 0.04327040932578729\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\ \ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7905982905982906,\n\ \ \"acc_stderr\": 0.026655699653922737,\n \"acc_norm\": 0.7905982905982906,\n\ \ \"acc_norm_stderr\": 0.026655699653922737\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.52,\n \"acc_stderr\": 0.05021167315686779,\n \ \ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.05021167315686779\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7292464878671775,\n\ \ \"acc_stderr\": 0.015889888362560483,\n \"acc_norm\": 0.7292464878671775,\n\ \ \"acc_norm_stderr\": 0.015889888362560483\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.638728323699422,\n \"acc_stderr\": 0.025862201852277906,\n\ \ \"acc_norm\": 0.638728323699422,\n \"acc_norm_stderr\": 0.025862201852277906\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.30614525139664805,\n\ \ \"acc_stderr\": 0.015414494487903219,\n \"acc_norm\": 0.30614525139664805,\n\ \ \"acc_norm_stderr\": 0.015414494487903219\n },\n 
\"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.630718954248366,\n \"acc_stderr\": 0.027634176689602663,\n\ \ \"acc_norm\": 0.630718954248366,\n \"acc_norm_stderr\": 0.027634176689602663\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6205787781350482,\n\ \ \"acc_stderr\": 0.027559949802347817,\n \"acc_norm\": 0.6205787781350482,\n\ \ \"acc_norm_stderr\": 0.027559949802347817\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.6018518518518519,\n \"acc_stderr\": 0.02723741509459248,\n\ \ \"acc_norm\": 0.6018518518518519,\n \"acc_norm_stderr\": 0.02723741509459248\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.42907801418439717,\n \"acc_stderr\": 0.02952591430255856,\n \ \ \"acc_norm\": 0.42907801418439717,\n \"acc_norm_stderr\": 0.02952591430255856\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4198174706649283,\n\ \ \"acc_stderr\": 0.012604960816087378,\n \"acc_norm\": 0.4198174706649283,\n\ \ \"acc_norm_stderr\": 0.012604960816087378\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.5183823529411765,\n \"acc_stderr\": 0.030352303395351964,\n\ \ \"acc_norm\": 0.5183823529411765,\n \"acc_norm_stderr\": 0.030352303395351964\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.5424836601307189,\n \"acc_stderr\": 0.020154685712590884,\n \ \ \"acc_norm\": 0.5424836601307189,\n \"acc_norm_stderr\": 0.020154685712590884\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6090909090909091,\n\ \ \"acc_stderr\": 0.04673752333670238,\n \"acc_norm\": 0.6090909090909091,\n\ \ \"acc_norm_stderr\": 0.04673752333670238\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.6326530612244898,\n \"acc_stderr\": 0.030862144921087558,\n\ \ \"acc_norm\": 0.6326530612244898,\n \"acc_norm_stderr\": 0.030862144921087558\n\ \ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7064676616915423,\n\ \ 
\"acc_stderr\": 0.03220024104534205,\n \"acc_norm\": 0.7064676616915423,\n\ \ \"acc_norm_stderr\": 0.03220024104534205\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \ \ \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42168674698795183,\n\ \ \"acc_stderr\": 0.03844453181770917,\n \"acc_norm\": 0.42168674698795183,\n\ \ \"acc_norm_stderr\": 0.03844453181770917\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.7602339181286549,\n \"acc_stderr\": 0.03274485211946956,\n\ \ \"acc_norm\": 0.7602339181286549,\n \"acc_norm_stderr\": 0.03274485211946956\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3047735618115055,\n\ \ \"mc1_stderr\": 0.016114124156882455,\n \"mc2\": 0.43629332146485117,\n\ \ \"mc2_stderr\": 0.014738333697751311\n }\n}\n```" repo_url: https://huggingface.co/Undi95/LewdEngine leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_09_05T02_56_23.442470 path: - '**/details_harness|arc:challenge|25_2023-09-05T02:56:23.442470.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-09-05T02:56:23.442470.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_09_05T02_56_23.442470 path: - '**/details_harness|hellaswag|10_2023-09-05T02:56:23.442470.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-09-05T02:56:23.442470.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_09_05T02_56_23.442470 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-09-05T02:56:23.442470.parquet' - 
'**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T02:56:23.442470.parquet' - 
'**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-management|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-09-05T02:56:23.442470.parquet' - 
'**/details_harness|hendrycksTest-philosophy|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-09-05T02:56:23.442470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T02:56:23.442470.parquet' - 
'**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T02:56:23.442470.parquet' - 
'**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-management|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T02:56:23.442470.parquet' - 
'**/details_harness|hendrycksTest-public_relations|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-09-05T02:56:23.442470.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-09-05T02:56:23.442470.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_09_05T02_56_23.442470 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T02:56:23.442470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T02:56:23.442470.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_09_05T02_56_23.442470 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-09-05T02:56:23.442470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-09-05T02:56:23.442470.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_09_05T02_56_23.442470 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-09-05T02:56:23.442470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-09-05T02:56:23.442470.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_09_05T02_56_23.442470 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T02:56:23.442470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T02:56:23.442470.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_09_05T02_56_23.442470 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T02:56:23.442470.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T02:56:23.442470.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_09_05T02_56_23.442470 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-09-05T02:56:23.442470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-09-05T02:56:23.442470.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_09_05T02_56_23.442470 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T02:56:23.442470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T02:56:23.442470.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_09_05T02_56_23.442470 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T02:56:23.442470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T02:56:23.442470.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_09_05T02_56_23.442470 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T02:56:23.442470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T02:56:23.442470.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_09_05T02_56_23.442470 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T02:56:23.442470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T02:56:23.442470.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_09_05T02_56_23.442470 path: - '**/details_harness|hendrycksTest-college_physics|5_2023-09-05T02:56:23.442470.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-09-05T02:56:23.442470.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_09_05T02_56_23.442470 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-09-05T02:56:23.442470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-09-05T02:56:23.442470.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_09_05T02_56_23.442470 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T02:56:23.442470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T02:56:23.442470.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_09_05T02_56_23.442470 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-09-05T02:56:23.442470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-09-05T02:56:23.442470.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_09_05T02_56_23.442470 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T02:56:23.442470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T02:56:23.442470.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_09_05T02_56_23.442470 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T02:56:23.442470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T02:56:23.442470.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_09_05T02_56_23.442470 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T02:56:23.442470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T02:56:23.442470.parquet' - 
config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_09_05T02_56_23.442470 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-09-05T02:56:23.442470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-09-05T02:56:23.442470.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_09_05T02_56_23.442470 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T02:56:23.442470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T02:56:23.442470.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_09_05T02_56_23.442470 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T02:56:23.442470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T02:56:23.442470.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_09_05T02_56_23.442470 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T02:56:23.442470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T02:56:23.442470.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_09_05T02_56_23.442470 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T02:56:23.442470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T02:56:23.442470.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_09_05T02_56_23.442470 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T02:56:23.442470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T02:56:23.442470.parquet' - config_name: 
harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_09_05T02_56_23.442470 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T02:56:23.442470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T02:56:23.442470.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_09_05T02_56_23.442470 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T02:56:23.442470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T02:56:23.442470.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_09_05T02_56_23.442470 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T02:56:23.442470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T02:56:23.442470.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_09_05T02_56_23.442470 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T02:56:23.442470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T02:56:23.442470.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_09_05T02_56_23.442470 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T02:56:23.442470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T02:56:23.442470.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_09_05T02_56_23.442470 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T02:56:23.442470.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T02:56:23.442470.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_09_05T02_56_23.442470 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T02:56:23.442470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T02:56:23.442470.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_09_05T02_56_23.442470 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T02:56:23.442470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T02:56:23.442470.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_09_05T02_56_23.442470 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T02:56:23.442470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T02:56:23.442470.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_09_05T02_56_23.442470 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-09-05T02:56:23.442470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-09-05T02:56:23.442470.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_09_05T02_56_23.442470 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T02:56:23.442470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T02:56:23.442470.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_09_05T02_56_23.442470 path: - '**/details_harness|hendrycksTest-international_law|5_2023-09-05T02:56:23.442470.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-09-05T02:56:23.442470.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_09_05T02_56_23.442470 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T02:56:23.442470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T02:56:23.442470.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_09_05T02_56_23.442470 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T02:56:23.442470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T02:56:23.442470.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_09_05T02_56_23.442470 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T02:56:23.442470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T02:56:23.442470.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_09_05T02_56_23.442470 path: - '**/details_harness|hendrycksTest-management|5_2023-09-05T02:56:23.442470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-09-05T02:56:23.442470.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_09_05T02_56_23.442470 path: - '**/details_harness|hendrycksTest-marketing|5_2023-09-05T02:56:23.442470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-09-05T02:56:23.442470.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_09_05T02_56_23.442470 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T02:56:23.442470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T02:56:23.442470.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - 
split: 2023_09_05T02_56_23.442470 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T02:56:23.442470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T02:56:23.442470.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_09_05T02_56_23.442470 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T02:56:23.442470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T02:56:23.442470.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_09_05T02_56_23.442470 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T02:56:23.442470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T02:56:23.442470.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_09_05T02_56_23.442470 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-09-05T02:56:23.442470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-09-05T02:56:23.442470.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_09_05T02_56_23.442470 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-09-05T02:56:23.442470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-09-05T02:56:23.442470.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_09_05T02_56_23.442470 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-09-05T02:56:23.442470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-09-05T02:56:23.442470.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2023_09_05T02_56_23.442470 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T02:56:23.442470.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T02:56:23.442470.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_09_05T02_56_23.442470 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-09-05T02:56:23.442470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-09-05T02:56:23.442470.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_09_05T02_56_23.442470 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T02:56:23.442470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T02:56:23.442470.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_09_05T02_56_23.442470 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T02:56:23.442470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T02:56:23.442470.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_09_05T02_56_23.442470 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-09-05T02:56:23.442470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-09-05T02:56:23.442470.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_09_05T02_56_23.442470 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-09-05T02:56:23.442470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-09-05T02:56:23.442470.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_09_05T02_56_23.442470 path: - '**/details_harness|hendrycksTest-sociology|5_2023-09-05T02:56:23.442470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-09-05T02:56:23.442470.parquet' - 
config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_09_05T02_56_23.442470 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T02:56:23.442470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T02:56:23.442470.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_09_05T02_56_23.442470 path: - '**/details_harness|hendrycksTest-virology|5_2023-09-05T02:56:23.442470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-09-05T02:56:23.442470.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_09_05T02_56_23.442470 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-09-05T02:56:23.442470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-09-05T02:56:23.442470.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_09_05T02_56_23.442470 path: - '**/details_harness|truthfulqa:mc|0_2023-09-05T02:56:23.442470.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-09-05T02:56:23.442470.parquet' - config_name: results data_files: - split: 2023_09_05T02_56_23.442470 path: - results_2023-09-05T02:56:23.442470.parquet - split: latest path: - results_2023-09-05T02:56:23.442470.parquet --- # Dataset Card for Evaluation run of Undi95/LewdEngine ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/Undi95/LewdEngine - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [Undi95/LewdEngine](https://huggingface.co/Undi95/LewdEngine) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). 
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset

data = load_dataset("open-llm-leaderboard/details_Undi95__LewdEngine",
                    "harness_truthfulqa_mc_0",
                    split="train")
```

## Latest results

These are the [latest results from run 2023-09-05T02:56:23.442470](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__LewdEngine/blob/main/results_2023-09-05T02%3A56%3A23.442470.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.5500083344336444, "acc_stderr": 0.0345125155387716, "acc_norm": 0.5541685269228124, "acc_norm_stderr": 0.034490519380335635, "mc1": 0.3047735618115055, "mc1_stderr": 0.016114124156882455, "mc2": 0.43629332146485117, "mc2_stderr": 0.014738333697751311 }, "harness|arc:challenge|25": { "acc": 0.5571672354948806, "acc_stderr": 0.014515573873348902, "acc_norm": 0.6049488054607508, "acc_norm_stderr": 0.014285898292938165 }, "harness|hellaswag|10": { "acc": 0.6331408086038638, "acc_stderr": 0.004809626723626823, "acc_norm": 0.8308105954989046, "acc_norm_stderr": 0.0037415289563158417 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.37, "acc_stderr": 0.04852365870939099, "acc_norm": 0.37, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.4740740740740741, "acc_stderr": 0.04313531696750574, "acc_norm": 0.4740740740740741, "acc_norm_stderr": 0.04313531696750574 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.5197368421052632, "acc_stderr": 0.040657710025626036, "acc_norm": 0.5197368421052632, "acc_norm_stderr": 0.040657710025626036 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.52, "acc_stderr": 0.050211673156867795, "acc_norm": 0.52, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6, "acc_stderr": 0.03015113445777629, "acc_norm": 0.6, "acc_norm_stderr": 0.03015113445777629 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.5972222222222222, "acc_stderr": 0.04101405519842425, "acc_norm": 0.5972222222222222, "acc_norm_stderr": 0.04101405519842425 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.4, "acc_stderr": 0.04923659639173309, "acc_norm": 0.4, "acc_norm_stderr": 0.04923659639173309 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.44, "acc_stderr": 0.04988876515698589, "acc_norm": 0.44, "acc_norm_stderr": 0.04988876515698589 }, 
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.33, "acc_stderr": 0.04725815626252605, "acc_norm": 0.33, "acc_norm_stderr": 0.04725815626252605 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.4682080924855491, "acc_stderr": 0.03804749744364764, "acc_norm": 0.4682080924855491, "acc_norm_stderr": 0.03804749744364764 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.2647058823529412, "acc_stderr": 0.043898699568087764, "acc_norm": 0.2647058823529412, "acc_norm_stderr": 0.043898699568087764 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.67, "acc_stderr": 0.047258156262526094, "acc_norm": 0.67, "acc_norm_stderr": 0.047258156262526094 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.425531914893617, "acc_stderr": 0.03232146916224468, "acc_norm": 0.425531914893617, "acc_norm_stderr": 0.03232146916224468 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.2807017543859649, "acc_stderr": 0.042270544512322004, "acc_norm": 0.2807017543859649, "acc_norm_stderr": 0.042270544512322004 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.47586206896551725, "acc_stderr": 0.041618085035015295, "acc_norm": 0.47586206896551725, "acc_norm_stderr": 0.041618085035015295 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.335978835978836, "acc_stderr": 0.024326310529149138, "acc_norm": 0.335978835978836, "acc_norm_stderr": 0.024326310529149138 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.3253968253968254, "acc_stderr": 0.041905964388711366, "acc_norm": 0.3253968253968254, "acc_norm_stderr": 0.041905964388711366 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.37, "acc_stderr": 0.048523658709391, "acc_norm": 0.37, "acc_norm_stderr": 0.048523658709391 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.6483870967741936, "acc_stderr": 0.02716253782694846, "acc_norm": 0.6483870967741936, "acc_norm_stderr": 0.02716253782694846 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 
0.4630541871921182, "acc_stderr": 0.035083705204426656, "acc_norm": 0.4630541871921182, "acc_norm_stderr": 0.035083705204426656 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.58, "acc_stderr": 0.049604496374885836, "acc_norm": 0.58, "acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.6848484848484848, "acc_stderr": 0.0362773057502241, "acc_norm": 0.6848484848484848, "acc_norm_stderr": 0.0362773057502241 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.6868686868686869, "acc_stderr": 0.033042050878136525, "acc_norm": 0.6868686868686869, "acc_norm_stderr": 0.033042050878136525 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8082901554404145, "acc_stderr": 0.02840895362624528, "acc_norm": 0.8082901554404145, "acc_norm_stderr": 0.02840895362624528 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5076923076923077, "acc_stderr": 0.025348006031534778, "acc_norm": 0.5076923076923077, "acc_norm_stderr": 0.025348006031534778 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3111111111111111, "acc_stderr": 0.02822644674968352, "acc_norm": 0.3111111111111111, "acc_norm_stderr": 0.02822644674968352 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.5756302521008403, "acc_stderr": 0.032104790510157764, "acc_norm": 0.5756302521008403, "acc_norm_stderr": 0.032104790510157764 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.33774834437086093, "acc_stderr": 0.038615575462551684, "acc_norm": 0.33774834437086093, "acc_norm_stderr": 0.038615575462551684 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.7155963302752294, "acc_stderr": 0.01934203658770259, "acc_norm": 0.7155963302752294, "acc_norm_stderr": 0.01934203658770259 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.4166666666666667, "acc_stderr": 0.033622774366080445, "acc_norm": 0.4166666666666667, 
"acc_norm_stderr": 0.033622774366080445 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7549019607843137, "acc_stderr": 0.030190282453501943, "acc_norm": 0.7549019607843137, "acc_norm_stderr": 0.030190282453501943 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7215189873417721, "acc_stderr": 0.029178682304842538, "acc_norm": 0.7215189873417721, "acc_norm_stderr": 0.029178682304842538 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6816143497757847, "acc_stderr": 0.03126580522513713, "acc_norm": 0.6816143497757847, "acc_norm_stderr": 0.03126580522513713 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.6564885496183206, "acc_stderr": 0.041649760719448786, "acc_norm": 0.6564885496183206, "acc_norm_stderr": 0.041649760719448786 }, "harness|hendrycksTest-international_law|5": { "acc": 0.71900826446281, "acc_stderr": 0.041032038305145124, "acc_norm": 0.71900826446281, "acc_norm_stderr": 0.041032038305145124 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7129629629629629, "acc_stderr": 0.04373313040914761, "acc_norm": 0.7129629629629629, "acc_norm_stderr": 0.04373313040914761 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.6809815950920245, "acc_stderr": 0.03661997551073836, "acc_norm": 0.6809815950920245, "acc_norm_stderr": 0.03661997551073836 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.29464285714285715, "acc_stderr": 0.04327040932578729, "acc_norm": 0.29464285714285715, "acc_norm_stderr": 0.04327040932578729 }, "harness|hendrycksTest-management|5": { "acc": 0.7669902912621359, "acc_stderr": 0.04185832598928315, "acc_norm": 0.7669902912621359, "acc_norm_stderr": 0.04185832598928315 }, "harness|hendrycksTest-marketing|5": { "acc": 0.7905982905982906, "acc_stderr": 0.026655699653922737, "acc_norm": 0.7905982905982906, "acc_norm_stderr": 0.026655699653922737 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.52, "acc_stderr": 0.05021167315686779, "acc_norm": 0.52, "acc_norm_stderr": 
0.05021167315686779 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7292464878671775, "acc_stderr": 0.015889888362560483, "acc_norm": 0.7292464878671775, "acc_norm_stderr": 0.015889888362560483 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.638728323699422, "acc_stderr": 0.025862201852277906, "acc_norm": 0.638728323699422, "acc_norm_stderr": 0.025862201852277906 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.30614525139664805, "acc_stderr": 0.015414494487903219, "acc_norm": 0.30614525139664805, "acc_norm_stderr": 0.015414494487903219 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.630718954248366, "acc_stderr": 0.027634176689602663, "acc_norm": 0.630718954248366, "acc_norm_stderr": 0.027634176689602663 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6205787781350482, "acc_stderr": 0.027559949802347817, "acc_norm": 0.6205787781350482, "acc_norm_stderr": 0.027559949802347817 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.6018518518518519, "acc_stderr": 0.02723741509459248, "acc_norm": 0.6018518518518519, "acc_norm_stderr": 0.02723741509459248 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.42907801418439717, "acc_stderr": 0.02952591430255856, "acc_norm": 0.42907801418439717, "acc_norm_stderr": 0.02952591430255856 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4198174706649283, "acc_stderr": 0.012604960816087378, "acc_norm": 0.4198174706649283, "acc_norm_stderr": 0.012604960816087378 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.5183823529411765, "acc_stderr": 0.030352303395351964, "acc_norm": 0.5183823529411765, "acc_norm_stderr": 0.030352303395351964 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.5424836601307189, "acc_stderr": 0.020154685712590884, "acc_norm": 0.5424836601307189, "acc_norm_stderr": 0.020154685712590884 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6090909090909091, "acc_stderr": 0.04673752333670238, "acc_norm": 0.6090909090909091, 
"acc_norm_stderr": 0.04673752333670238 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.6326530612244898, "acc_stderr": 0.030862144921087558, "acc_norm": 0.6326530612244898, "acc_norm_stderr": 0.030862144921087558 }, "harness|hendrycksTest-sociology|5": { "acc": 0.7064676616915423, "acc_stderr": 0.03220024104534205, "acc_norm": 0.7064676616915423, "acc_norm_stderr": 0.03220024104534205 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.79, "acc_stderr": 0.040936018074033256, "acc_norm": 0.79, "acc_norm_stderr": 0.040936018074033256 }, "harness|hendrycksTest-virology|5": { "acc": 0.42168674698795183, "acc_stderr": 0.03844453181770917, "acc_norm": 0.42168674698795183, "acc_norm_stderr": 0.03844453181770917 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7602339181286549, "acc_stderr": 0.03274485211946956, "acc_norm": 0.7602339181286549, "acc_norm_stderr": 0.03274485211946956 }, "harness|truthfulqa:mc|0": { "mc1": 0.3047735618115055, "mc1_stderr": 0.016114124156882455, "mc2": 0.43629332146485117, "mc2_stderr": 0.014738333697751311 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? 
[More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
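The split-naming convention described in the card's summary ("the split being named using the timestamp of the run") can be made concrete with a small helper. This is a sketch inferred from the YAML header above, where run `2023-09-05T02:56:23.442470` appears as split `2023_09_05T02_56_23.442470`; the function name is ours, not part of the `datasets` library:

```python
def run_timestamp_to_split(timestamp: str) -> str:
    """Map an evaluation run timestamp to its split name.

    Inferred from the YAML header of this card: '-' and ':' are replaced
    by '_', while the fractional-second '.' is kept, e.g.
    '2023-09-05T02:56:23.442470' -> '2023_09_05T02_56_23.442470'.
    """
    return timestamp.replace("-", "_").replace(":", "_")


print(run_timestamp_to_split("2023-09-05T02:56:23.442470"))
# -> 2023_09_05T02_56_23.442470
```

In practice you rarely need to build these names by hand: per the YAML header, every configuration also exposes a `latest` split, e.g. `load_dataset("open-llm-leaderboard/details_Undi95__LewdEngine", "results", split="latest")` for the aggregated metrics.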
jskinner215/multi_kaggle_churn
2023-09-05T04:08:18.000Z
[ "region:us" ]
jskinner215
null
null
null
0
0
Entry not found
yan894632016/yan2
2023-09-05T03:06:28.000Z
[ "region:us" ]
yan894632016
null
null
null
0
0
Entry not found
Aminael/test-veinte
2023-09-05T03:07:43.000Z
[ "region:us" ]
Aminael
null
null
null
0
0
Entry not found
open-llm-leaderboard/details_Mikivis__gpt2-large-lora-sft
2023-09-05T03:16:52.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
null
0
0
--- pretty_name: Evaluation run of Mikivis/gpt2-large-lora-sft dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [Mikivis/gpt2-large-lora-sft](https://huggingface.co/Mikivis/gpt2-large-lora-sft)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 61 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split always points to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Mikivis__gpt2-large-lora-sft\"\ ,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\ \nThese are the [latest results from run 2023-09-05T03:15:39.228135](https://huggingface.co/datasets/open-llm-leaderboard/details_Mikivis__gpt2-large-lora-sft/blob/main/results_2023-09-05T03%3A15%3A39.228135.json)(note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2595637781428506,\n\ \ \"acc_stderr\": 0.03171942417535736,\n \"acc_norm\": 0.2614937356060542,\n\ \ \"acc_norm_stderr\": 0.03173097852093505,\n \"mc1\": 0.23623011015911874,\n\ \ \"mc1_stderr\": 0.014869755015871114,\n \"mc2\": 0.39061720011720086,\n\ \ \"mc2_stderr\": 0.014622336164598085\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.23720136518771331,\n \"acc_stderr\": 0.01243039982926084,\n\ \ \"acc_norm\": 0.26791808873720135,\n \"acc_norm_stderr\": 0.012942030195136433\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.35839474208325034,\n\ \ \"acc_stderr\": 0.004785488626807568,\n \"acc_norm\": 0.4415455088627763,\n\ \ \"acc_norm_stderr\": 0.00495556465001617\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \ \ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3333333333333333,\n\ \ \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.3333333333333333,\n\ \ \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n\ \ \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.25,\n\ \ \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \ \ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.26037735849056604,\n \"acc_stderr\": 0.027008766090708094,\n\ \ \"acc_norm\": 0.26037735849056604,\n \"acc_norm_stderr\": 0.027008766090708094\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\ \ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\ \ \"acc_norm_stderr\": 
0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \ \ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n\ \ \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \ \ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.26011560693641617,\n\ \ \"acc_stderr\": 0.033450369167889925,\n \"acc_norm\": 0.26011560693641617,\n\ \ \"acc_norm_stderr\": 0.033450369167889925\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.13725490196078433,\n \"acc_stderr\": 0.034240846698915216,\n\ \ \"acc_norm\": 0.13725490196078433,\n \"acc_norm_stderr\": 0.034240846698915216\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n\ \ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.24680851063829787,\n \"acc_stderr\": 0.0281854413012341,\n\ \ \"acc_norm\": 0.24680851063829787,\n \"acc_norm_stderr\": 0.0281854413012341\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\ \ \"acc_stderr\": 0.03999423879281336,\n \"acc_norm\": 0.23684210526315788,\n\ \ \"acc_norm_stderr\": 0.03999423879281336\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.25517241379310346,\n \"acc_stderr\": 0.03632984052707842,\n\ \ \"acc_norm\": 0.25517241379310346,\n \"acc_norm_stderr\": 0.03632984052707842\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.24074074074074073,\n \"acc_stderr\": 0.022019080012217897,\n \"\ acc_norm\": 
0.24074074074074073,\n \"acc_norm_stderr\": 0.022019080012217897\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.15079365079365079,\n\ \ \"acc_stderr\": 0.03200686497287392,\n \"acc_norm\": 0.15079365079365079,\n\ \ \"acc_norm_stderr\": 0.03200686497287392\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.19,\n \"acc_stderr\": 0.039427724440366234,\n \ \ \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.039427724440366234\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\ : 0.2838709677419355,\n \"acc_stderr\": 0.025649381063029265,\n \"\ acc_norm\": 0.2838709677419355,\n \"acc_norm_stderr\": 0.025649381063029265\n\ \ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\ : 0.2857142857142857,\n \"acc_stderr\": 0.031785297106427496,\n \"\ acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.031785297106427496\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\"\ : 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.21212121212121213,\n \"acc_stderr\": 0.03192271569548299,\n\ \ \"acc_norm\": 0.21212121212121213,\n \"acc_norm_stderr\": 0.03192271569548299\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.30303030303030304,\n \"acc_stderr\": 0.032742879140268674,\n \"\ acc_norm\": 0.30303030303030304,\n \"acc_norm_stderr\": 0.032742879140268674\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.22279792746113988,\n \"acc_stderr\": 0.03003114797764154,\n\ \ \"acc_norm\": 0.22279792746113988,\n \"acc_norm_stderr\": 0.03003114797764154\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.2128205128205128,\n \"acc_stderr\": 0.020752423722128006,\n\ \ \"acc_norm\": 0.2128205128205128,\n \"acc_norm_stderr\": 0.020752423722128006\n\ \ 
},\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.29259259259259257,\n \"acc_stderr\": 0.02773896963217609,\n \ \ \"acc_norm\": 0.29259259259259257,\n \"acc_norm_stderr\": 0.02773896963217609\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.21428571428571427,\n \"acc_stderr\": 0.026653531596715477,\n\ \ \"acc_norm\": 0.21428571428571427,\n \"acc_norm_stderr\": 0.026653531596715477\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.23841059602649006,\n \"acc_stderr\": 0.0347918557259966,\n \"\ acc_norm\": 0.23841059602649006,\n \"acc_norm_stderr\": 0.0347918557259966\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.26788990825688075,\n \"acc_stderr\": 0.01898746225797865,\n \"\ acc_norm\": 0.26788990825688075,\n \"acc_norm_stderr\": 0.01898746225797865\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.42592592592592593,\n \"acc_stderr\": 0.033723432716530624,\n \"\ acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.033723432716530624\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.24509803921568626,\n \"acc_stderr\": 0.03019028245350195,\n \"\ acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.03019028245350195\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293433,\n \ \ \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293433\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.242152466367713,\n\ \ \"acc_stderr\": 0.028751392398694755,\n \"acc_norm\": 0.242152466367713,\n\ \ \"acc_norm_stderr\": 0.028751392398694755\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.2366412213740458,\n \"acc_stderr\": 0.03727673575596919,\n\ \ \"acc_norm\": 0.2366412213740458,\n \"acc_norm_stderr\": 0.03727673575596919\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.32231404958677684,\n \"acc_stderr\": 0.04266416363352168,\n \"\ acc_norm\": 0.32231404958677684,\n \"acc_norm_stderr\": 0.04266416363352168\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25,\n\ \ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.25,\n \ \ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.3006134969325153,\n \"acc_stderr\": 0.03602511318806771,\n\ \ \"acc_norm\": 0.3006134969325153,\n \"acc_norm_stderr\": 0.03602511318806771\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.30357142857142855,\n\ \ \"acc_stderr\": 0.04364226155841044,\n \"acc_norm\": 0.30357142857142855,\n\ \ \"acc_norm_stderr\": 0.04364226155841044\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.39805825242718446,\n \"acc_stderr\": 0.0484674825397724,\n\ \ \"acc_norm\": 0.39805825242718446,\n \"acc_norm_stderr\": 0.0484674825397724\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2264957264957265,\n\ \ \"acc_stderr\": 0.027421007295392926,\n \"acc_norm\": 0.2264957264957265,\n\ \ \"acc_norm_stderr\": 0.027421007295392926\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932269,\n \ \ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932269\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.25798212005108556,\n\ \ \"acc_stderr\": 0.01564583018834895,\n \"acc_norm\": 0.25798212005108556,\n\ \ \"acc_norm_stderr\": 0.01564583018834895\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.2774566473988439,\n \"acc_stderr\": 0.024105712607754307,\n\ \ \"acc_norm\": 0.2774566473988439,\n \"acc_norm_stderr\": 0.024105712607754307\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2446927374301676,\n\ \ \"acc_stderr\": 0.014378169884098414,\n \"acc_norm\": 0.2446927374301676,\n\ \ 
\"acc_norm_stderr\": 0.014378169884098414\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.25163398692810457,\n \"acc_stderr\": 0.024848018263875195,\n\ \ \"acc_norm\": 0.25163398692810457,\n \"acc_norm_stderr\": 0.024848018263875195\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2057877813504823,\n\ \ \"acc_stderr\": 0.022961339906764234,\n \"acc_norm\": 0.2057877813504823,\n\ \ \"acc_norm_stderr\": 0.022961339906764234\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.2623456790123457,\n \"acc_stderr\": 0.024477222856135114,\n\ \ \"acc_norm\": 0.2623456790123457,\n \"acc_norm_stderr\": 0.024477222856135114\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.22340425531914893,\n \"acc_stderr\": 0.024847921358063962,\n \ \ \"acc_norm\": 0.22340425531914893,\n \"acc_norm_stderr\": 0.024847921358063962\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n\ \ \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n\ \ \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.27941176470588236,\n \"acc_stderr\": 0.027257202606114944,\n\ \ \"acc_norm\": 0.27941176470588236,\n \"acc_norm_stderr\": 0.027257202606114944\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.25163398692810457,\n \"acc_stderr\": 0.01755581809132226,\n \ \ \"acc_norm\": 0.25163398692810457,\n \"acc_norm_stderr\": 0.01755581809132226\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.18181818181818182,\n\ \ \"acc_stderr\": 0.03694284335337802,\n \"acc_norm\": 0.18181818181818182,\n\ \ \"acc_norm_stderr\": 0.03694284335337802\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.24897959183673468,\n \"acc_stderr\": 0.027682979522960234,\n\ \ \"acc_norm\": 0.24897959183673468,\n \"acc_norm_stderr\": 0.027682979522960234\n\ \ },\n 
\"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.2835820895522388,\n\ \ \"acc_stderr\": 0.03187187537919797,\n \"acc_norm\": 0.2835820895522388,\n\ \ \"acc_norm_stderr\": 0.03187187537919797\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \ \ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.25301204819277107,\n\ \ \"acc_stderr\": 0.03384429155233133,\n \"acc_norm\": 0.25301204819277107,\n\ \ \"acc_norm_stderr\": 0.03384429155233133\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.2807017543859649,\n \"acc_stderr\": 0.034462962170884265,\n\ \ \"acc_norm\": 0.2807017543859649,\n \"acc_norm_stderr\": 0.034462962170884265\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23623011015911874,\n\ \ \"mc1_stderr\": 0.014869755015871114,\n \"mc2\": 0.39061720011720086,\n\ \ \"mc2_stderr\": 0.014622336164598085\n }\n}\n```" repo_url: https://huggingface.co/Mikivis/gpt2-large-lora-sft leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_09_05T03_15_39.228135 path: - '**/details_harness|arc:challenge|25_2023-09-05T03:15:39.228135.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-09-05T03:15:39.228135.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_09_05T03_15_39.228135 path: - '**/details_harness|hellaswag|10_2023-09-05T03:15:39.228135.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-09-05T03:15:39.228135.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_09_05T03_15_39.228135 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-09-05T03:15:39.228135.parquet' - 
'**/details_harness|hendrycksTest-astronomy|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T03:15:39.228135.parquet' - 
'**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-management|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T03:15:39.228135.parquet' - 
'**/details_harness|hendrycksTest-nutrition|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-09-05T03:15:39.228135.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T03:15:39.228135.parquet' - 
'**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T03:15:39.228135.parquet' - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-management|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T03:15:39.228135.parquet' - 
'**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-09-05T03:15:39.228135.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-09-05T03:15:39.228135.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_09_05T03_15_39.228135 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T03:15:39.228135.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T03:15:39.228135.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_09_05T03_15_39.228135 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-09-05T03:15:39.228135.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-09-05T03:15:39.228135.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_09_05T03_15_39.228135 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-09-05T03:15:39.228135.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-09-05T03:15:39.228135.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_09_05T03_15_39.228135 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T03:15:39.228135.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T03:15:39.228135.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_09_05T03_15_39.228135 path: - 
'**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T03:15:39.228135.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T03:15:39.228135.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_09_05T03_15_39.228135 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-09-05T03:15:39.228135.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-09-05T03:15:39.228135.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_09_05T03_15_39.228135 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T03:15:39.228135.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T03:15:39.228135.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_09_05T03_15_39.228135 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T03:15:39.228135.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T03:15:39.228135.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_09_05T03_15_39.228135 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T03:15:39.228135.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T03:15:39.228135.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_09_05T03_15_39.228135 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T03:15:39.228135.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T03:15:39.228135.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_09_05T03_15_39.228135 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-09-05T03:15:39.228135.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-09-05T03:15:39.228135.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_09_05T03_15_39.228135 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-09-05T03:15:39.228135.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-09-05T03:15:39.228135.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_09_05T03_15_39.228135 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T03:15:39.228135.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T03:15:39.228135.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_09_05T03_15_39.228135 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-09-05T03:15:39.228135.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-09-05T03:15:39.228135.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_09_05T03_15_39.228135 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T03:15:39.228135.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T03:15:39.228135.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_09_05T03_15_39.228135 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T03:15:39.228135.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T03:15:39.228135.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_09_05T03_15_39.228135 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T03:15:39.228135.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T03:15:39.228135.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_09_05T03_15_39.228135 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-09-05T03:15:39.228135.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-09-05T03:15:39.228135.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_09_05T03_15_39.228135 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T03:15:39.228135.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T03:15:39.228135.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_09_05T03_15_39.228135 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T03:15:39.228135.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T03:15:39.228135.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_09_05T03_15_39.228135 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T03:15:39.228135.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T03:15:39.228135.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_09_05T03_15_39.228135 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T03:15:39.228135.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T03:15:39.228135.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_09_05T03_15_39.228135 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T03:15:39.228135.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T03:15:39.228135.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_09_05T03_15_39.228135 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T03:15:39.228135.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T03:15:39.228135.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_09_05T03_15_39.228135 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T03:15:39.228135.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T03:15:39.228135.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_09_05T03_15_39.228135 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T03:15:39.228135.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T03:15:39.228135.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_09_05T03_15_39.228135 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T03:15:39.228135.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T03:15:39.228135.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_09_05T03_15_39.228135 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T03:15:39.228135.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T03:15:39.228135.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_09_05T03_15_39.228135 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T03:15:39.228135.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T03:15:39.228135.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_09_05T03_15_39.228135 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T03:15:39.228135.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T03:15:39.228135.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_09_05T03_15_39.228135 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T03:15:39.228135.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T03:15:39.228135.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_09_05T03_15_39.228135 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T03:15:39.228135.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T03:15:39.228135.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_09_05T03_15_39.228135 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-09-05T03:15:39.228135.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-09-05T03:15:39.228135.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_09_05T03_15_39.228135 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T03:15:39.228135.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T03:15:39.228135.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_09_05T03_15_39.228135 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-09-05T03:15:39.228135.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-09-05T03:15:39.228135.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_09_05T03_15_39.228135 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T03:15:39.228135.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T03:15:39.228135.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_09_05T03_15_39.228135 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T03:15:39.228135.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T03:15:39.228135.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_09_05T03_15_39.228135 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T03:15:39.228135.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T03:15:39.228135.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_09_05T03_15_39.228135 path: - '**/details_harness|hendrycksTest-management|5_2023-09-05T03:15:39.228135.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-09-05T03:15:39.228135.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_09_05T03_15_39.228135 path: - '**/details_harness|hendrycksTest-marketing|5_2023-09-05T03:15:39.228135.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-09-05T03:15:39.228135.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_09_05T03_15_39.228135 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T03:15:39.228135.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T03:15:39.228135.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_09_05T03_15_39.228135 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T03:15:39.228135.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T03:15:39.228135.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_09_05T03_15_39.228135 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T03:15:39.228135.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T03:15:39.228135.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_09_05T03_15_39.228135 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T03:15:39.228135.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T03:15:39.228135.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_09_05T03_15_39.228135 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-09-05T03:15:39.228135.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-09-05T03:15:39.228135.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_09_05T03_15_39.228135 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-09-05T03:15:39.228135.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-09-05T03:15:39.228135.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_09_05T03_15_39.228135 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-09-05T03:15:39.228135.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-09-05T03:15:39.228135.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_09_05T03_15_39.228135 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T03:15:39.228135.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T03:15:39.228135.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_09_05T03_15_39.228135 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-09-05T03:15:39.228135.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-09-05T03:15:39.228135.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_09_05T03_15_39.228135 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T03:15:39.228135.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T03:15:39.228135.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_09_05T03_15_39.228135 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T03:15:39.228135.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T03:15:39.228135.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_09_05T03_15_39.228135 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-09-05T03:15:39.228135.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-09-05T03:15:39.228135.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_09_05T03_15_39.228135 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-09-05T03:15:39.228135.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-09-05T03:15:39.228135.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_09_05T03_15_39.228135 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-09-05T03:15:39.228135.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-09-05T03:15:39.228135.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_09_05T03_15_39.228135 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T03:15:39.228135.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T03:15:39.228135.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_09_05T03_15_39.228135 path: - '**/details_harness|hendrycksTest-virology|5_2023-09-05T03:15:39.228135.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-09-05T03:15:39.228135.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_09_05T03_15_39.228135 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-09-05T03:15:39.228135.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-09-05T03:15:39.228135.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_09_05T03_15_39.228135 path: - '**/details_harness|truthfulqa:mc|0_2023-09-05T03:15:39.228135.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-09-05T03:15:39.228135.parquet' - config_name: results data_files: - split: 2023_09_05T03_15_39.228135 path: - results_2023-09-05T03:15:39.228135.parquet - split: latest path: - results_2023-09-05T03:15:39.228135.parquet --- # Dataset Card for Evaluation run of Mikivis/gpt2-large-lora-sft ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/Mikivis/gpt2-large-lora-sft - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model 
[Mikivis/gpt2-large-lora-sft](https://huggingface.co/Mikivis/gpt2-large-lora-sft)
on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Mikivis__gpt2-large-lora-sft",
	"harness_truthfulqa_mc_0",
	split="latest")
```

## Latest results

These are the [latest results from run 2023-09-05T03:15:39.228135](https://huggingface.co/datasets/open-llm-leaderboard/details_Mikivis__gpt2-large-lora-sft/blob/main/results_2023-09-05T03%3A15%3A39.228135.json)(note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.2595637781428506, "acc_stderr": 0.03171942417535736, "acc_norm": 0.2614937356060542, "acc_norm_stderr": 0.03173097852093505, "mc1": 0.23623011015911874, "mc1_stderr": 0.014869755015871114, "mc2": 0.39061720011720086, "mc2_stderr": 0.014622336164598085 }, "harness|arc:challenge|25": { "acc": 0.23720136518771331, "acc_stderr": 0.01243039982926084, "acc_norm": 0.26791808873720135, "acc_norm_stderr": 0.012942030195136433 }, "harness|hellaswag|10": { "acc": 0.35839474208325034, "acc_stderr": 0.004785488626807568, "acc_norm": 0.4415455088627763, "acc_norm_stderr": 0.00495556465001617 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.23, "acc_stderr": 0.04229525846816506, "acc_norm": 0.23, "acc_norm_stderr": 0.04229525846816506 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.3333333333333333, "acc_stderr": 0.04072314811876837, "acc_norm": 0.3333333333333333, "acc_norm_stderr": 0.04072314811876837 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.17763157894736842, "acc_stderr": 0.031103182383123398, "acc_norm": 0.17763157894736842, "acc_norm_stderr": 0.031103182383123398 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.25, "acc_stderr": 0.04351941398892446, "acc_norm": 0.25, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.26037735849056604, "acc_stderr": 0.027008766090708094, "acc_norm": 0.26037735849056604, "acc_norm_stderr": 0.027008766090708094 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.2569444444444444, "acc_stderr": 0.03653946969442099, "acc_norm": 0.2569444444444444, "acc_norm_stderr": 0.03653946969442099 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.25, "acc_stderr": 0.04351941398892446, "acc_norm": 0.25, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.35, "acc_stderr": 0.047937248544110196, "acc_norm": 0.35, 
"acc_norm_stderr": 0.047937248544110196 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.26011560693641617, "acc_stderr": 0.033450369167889925, "acc_norm": 0.26011560693641617, "acc_norm_stderr": 0.033450369167889925 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.13725490196078433, "acc_stderr": 0.034240846698915216, "acc_norm": 0.13725490196078433, "acc_norm_stderr": 0.034240846698915216 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.29, "acc_stderr": 0.045604802157206845, "acc_norm": 0.29, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.24680851063829787, "acc_stderr": 0.0281854413012341, "acc_norm": 0.24680851063829787, "acc_norm_stderr": 0.0281854413012341 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.23684210526315788, "acc_stderr": 0.03999423879281336, "acc_norm": 0.23684210526315788, "acc_norm_stderr": 0.03999423879281336 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.25517241379310346, "acc_stderr": 0.03632984052707842, "acc_norm": 0.25517241379310346, "acc_norm_stderr": 0.03632984052707842 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.24074074074074073, "acc_stderr": 0.022019080012217897, "acc_norm": 0.24074074074074073, "acc_norm_stderr": 0.022019080012217897 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.15079365079365079, "acc_stderr": 0.03200686497287392, "acc_norm": 0.15079365079365079, "acc_norm_stderr": 0.03200686497287392 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.19, "acc_stderr": 0.039427724440366234, "acc_norm": 0.19, "acc_norm_stderr": 0.039427724440366234 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.2838709677419355, "acc_stderr": 0.025649381063029265, "acc_norm": 0.2838709677419355, "acc_norm_stderr": 0.025649381063029265 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.2857142857142857, "acc_stderr": 0.031785297106427496, "acc_norm": 0.2857142857142857, "acc_norm_stderr": 0.031785297106427496 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.33, "acc_stderr": 0.047258156262526045, "acc_norm": 0.33, "acc_norm_stderr": 0.047258156262526045 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.21212121212121213, "acc_stderr": 0.03192271569548299, "acc_norm": 0.21212121212121213, "acc_norm_stderr": 0.03192271569548299 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.30303030303030304, "acc_stderr": 0.032742879140268674, "acc_norm": 0.30303030303030304, "acc_norm_stderr": 0.032742879140268674 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.22279792746113988, "acc_stderr": 0.03003114797764154, "acc_norm": 0.22279792746113988, "acc_norm_stderr": 0.03003114797764154 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.2128205128205128, "acc_stderr": 0.020752423722128006, "acc_norm": 0.2128205128205128, "acc_norm_stderr": 0.020752423722128006 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.29259259259259257, "acc_stderr": 0.02773896963217609, "acc_norm": 0.29259259259259257, "acc_norm_stderr": 0.02773896963217609 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.21428571428571427, "acc_stderr": 0.026653531596715477, "acc_norm": 0.21428571428571427, "acc_norm_stderr": 0.026653531596715477 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.23841059602649006, "acc_stderr": 0.0347918557259966, "acc_norm": 0.23841059602649006, "acc_norm_stderr": 0.0347918557259966 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.26788990825688075, "acc_stderr": 0.01898746225797865, "acc_norm": 0.26788990825688075, "acc_norm_stderr": 0.01898746225797865 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.42592592592592593, "acc_stderr": 
0.033723432716530624, "acc_norm": 0.42592592592592593, "acc_norm_stderr": 0.033723432716530624 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.24509803921568626, "acc_stderr": 0.03019028245350195, "acc_norm": 0.24509803921568626, "acc_norm_stderr": 0.03019028245350195 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.270042194092827, "acc_stderr": 0.028900721906293433, "acc_norm": 0.270042194092827, "acc_norm_stderr": 0.028900721906293433 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.242152466367713, "acc_stderr": 0.028751392398694755, "acc_norm": 0.242152466367713, "acc_norm_stderr": 0.028751392398694755 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.2366412213740458, "acc_stderr": 0.03727673575596919, "acc_norm": 0.2366412213740458, "acc_norm_stderr": 0.03727673575596919 }, "harness|hendrycksTest-international_law|5": { "acc": 0.32231404958677684, "acc_stderr": 0.04266416363352168, "acc_norm": 0.32231404958677684, "acc_norm_stderr": 0.04266416363352168 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.25, "acc_stderr": 0.04186091791394607, "acc_norm": 0.25, "acc_norm_stderr": 0.04186091791394607 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.3006134969325153, "acc_stderr": 0.03602511318806771, "acc_norm": 0.3006134969325153, "acc_norm_stderr": 0.03602511318806771 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.30357142857142855, "acc_stderr": 0.04364226155841044, "acc_norm": 0.30357142857142855, "acc_norm_stderr": 0.04364226155841044 }, "harness|hendrycksTest-management|5": { "acc": 0.39805825242718446, "acc_stderr": 0.0484674825397724, "acc_norm": 0.39805825242718446, "acc_norm_stderr": 0.0484674825397724 }, "harness|hendrycksTest-marketing|5": { "acc": 0.2264957264957265, "acc_stderr": 0.027421007295392926, "acc_norm": 0.2264957264957265, "acc_norm_stderr": 0.027421007295392926 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.22, "acc_stderr": 0.04163331998932269, "acc_norm": 
0.22, "acc_norm_stderr": 0.04163331998932269 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.25798212005108556, "acc_stderr": 0.01564583018834895, "acc_norm": 0.25798212005108556, "acc_norm_stderr": 0.01564583018834895 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.2774566473988439, "acc_stderr": 0.024105712607754307, "acc_norm": 0.2774566473988439, "acc_norm_stderr": 0.024105712607754307 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.2446927374301676, "acc_stderr": 0.014378169884098414, "acc_norm": 0.2446927374301676, "acc_norm_stderr": 0.014378169884098414 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.25163398692810457, "acc_stderr": 0.024848018263875195, "acc_norm": 0.25163398692810457, "acc_norm_stderr": 0.024848018263875195 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.2057877813504823, "acc_stderr": 0.022961339906764234, "acc_norm": 0.2057877813504823, "acc_norm_stderr": 0.022961339906764234 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.2623456790123457, "acc_stderr": 0.024477222856135114, "acc_norm": 0.2623456790123457, "acc_norm_stderr": 0.024477222856135114 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.22340425531914893, "acc_stderr": 0.024847921358063962, "acc_norm": 0.22340425531914893, "acc_norm_stderr": 0.024847921358063962 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.2457627118644068, "acc_stderr": 0.010996156635142692, "acc_norm": 0.2457627118644068, "acc_norm_stderr": 0.010996156635142692 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.27941176470588236, "acc_stderr": 0.027257202606114944, "acc_norm": 0.27941176470588236, "acc_norm_stderr": 0.027257202606114944 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.25163398692810457, "acc_stderr": 0.01755581809132226, "acc_norm": 0.25163398692810457, "acc_norm_stderr": 0.01755581809132226 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.18181818181818182, "acc_stderr": 0.03694284335337802, 
"acc_norm": 0.18181818181818182, "acc_norm_stderr": 0.03694284335337802 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.24897959183673468, "acc_stderr": 0.027682979522960234, "acc_norm": 0.24897959183673468, "acc_norm_stderr": 0.027682979522960234 }, "harness|hendrycksTest-sociology|5": { "acc": 0.2835820895522388, "acc_stderr": 0.03187187537919797, "acc_norm": 0.2835820895522388, "acc_norm_stderr": 0.03187187537919797 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.25, "acc_stderr": 0.04351941398892446, "acc_norm": 0.25, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-virology|5": { "acc": 0.25301204819277107, "acc_stderr": 0.03384429155233133, "acc_norm": 0.25301204819277107, "acc_norm_stderr": 0.03384429155233133 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.2807017543859649, "acc_stderr": 0.034462962170884265, "acc_norm": 0.2807017543859649, "acc_norm_stderr": 0.034462962170884265 }, "harness|truthfulqa:mc|0": { "mc1": 0.23623011015911874, "mc1_stderr": 0.014869755015871114, "mc2": 0.39061720011720086, "mc2_stderr": 0.014622336164598085 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? 
[More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
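The per-task results JSON above shares one shape across all these leaderboard cards: each `harness|…` key maps to `acc`/`acc_norm` (or `mc1`/`mc2`) values with standard errors. A minimal sketch of recomputing an aggregate from such a dict — the values below are a small subset copied from the JSON above, not the full run:

```python
# Average the 5-shot MMLU ("hendrycksTest") accuracies from a results
# dict shaped like the JSON above. Only a few subtasks are copied here
# for illustration; a real card carries all 57.
results = {
    "harness|arc:challenge|25": {"acc": 0.23720136518771331},
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.23},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.3333333333333333},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.17763157894736842},
}

# Select only the MMLU subtasks by key prefix, then take the plain mean.
mmlu_accs = [
    v["acc"] for k, v in results.items()
    if k.startswith("harness|hendrycksTest-")
]
mmlu_avg = sum(mmlu_accs) / len(mmlu_accs)
print(f"MMLU subtasks: {len(mmlu_accs)}, mean acc: {mmlu_avg:.4f}")
```

This unweighted mean over subtask `acc` values is how the per-subtask numbers roll up into a single MMLU figure; the "all" block at the top of the JSON is the analogous aggregate over every task.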
meandyou200175/Audio_Kien
2023-10-10T04:22:32.000Z
[ "region:us" ]
meandyou200175
null
null
null
0
0
Entry not found
markmp/marketing_email_test
2023-09-05T03:38:29.000Z
[ "region:us" ]
markmp
null
null
null
0
0
--- dataset_info: features: - name: product dtype: string - name: description dtype: string - name: marketing_email dtype: string splits: - name: train num_bytes: 13830 num_examples: 10 download_size: 18502 dataset_size: 13830 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "marketing_email_test" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
krishanusinha20/marketing_emails
2023-09-05T03:40:26.000Z
[ "region:us" ]
krishanusinha20
null
null
null
0
0
--- dataset_info: features: - name: product dtype: string - name: description dtype: string - name: marketing_email dtype: string splits: - name: train num_bytes: 20941 num_examples: 10 download_size: 26509 dataset_size: 20941 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "marketing_emails" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
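The front matter of cards like the one above is YAML (flattened onto one line in this dump); re-indented, it parses straight back into the split metadata. A sketch assuming PyYAML is available:

```python
import yaml  # PyYAML, assumed installed

# The marketing_emails card's front matter, re-indented from the
# flattened text above.
front_matter = """
dataset_info:
  features:
  - name: product
    dtype: string
  - name: description
    dtype: string
  - name: marketing_email
    dtype: string
  splits:
  - name: train
    num_bytes: 20941
    num_examples: 10
  download_size: 26509
  dataset_size: 20941
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
"""

info = yaml.safe_load(front_matter)["dataset_info"]
train = info["splits"][0]
fields = [f["name"] for f in info["features"]]
print(fields)                                      # declared feature names
print(train["num_bytes"] / train["num_examples"])  # average bytes per example
```

Parsing the front matter this way recovers the schema and split sizes without downloading the data files themselves.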
open-llm-leaderboard/details_uukuguy__speechless-codellama-orca-airoboros-13b-0.10e
2023-09-12T14:43:38.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
null
0
0
--- pretty_name: Evaluation run of uukuguy/speechless-codellama-orca-airoboros-13b-0.10e dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [uukuguy/speechless-codellama-orca-airoboros-13b-0.10e](https://huggingface.co/uukuguy/speechless-codellama-orca-airoboros-13b-0.10e)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 61 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_uukuguy__speechless-codellama-orca-airoboros-13b-0.10e\"\ ,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\ \nThese are the [latest results from run 2023-09-12T14:42:21.510480](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__speechless-codellama-orca-airoboros-13b-0.10e/blob/main/results_2023-09-12T14-42-21.510480.json) (note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2542111825994659,\n\ \ \"acc_stderr\": 0.03135387998985871,\n \"acc_norm\": 0.2549947631985964,\n\ \ \"acc_norm_stderr\": 0.03136581756825153,\n \"mc1\": 0.2423500611995104,\n\ \ \"mc1_stderr\": 0.015000674373570342,\n \"mc2\": 0.4963684639744407,\n\ \ \"mc2_stderr\": 0.016496718744347754\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.24744027303754265,\n \"acc_stderr\": 0.012610352663292673,\n\ \ \"acc_norm\": 0.29436860068259385,\n \"acc_norm_stderr\": 0.013318528460539427\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.25781716789484166,\n\ \ \"acc_stderr\": 0.004365388351563103,\n \"acc_norm\": 0.25712009559848636,\n\ \ \"acc_norm_stderr\": 0.004361529679492747\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.18,\n \"acc_stderr\": 0.03861229196653696,\n \ \ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.03861229196653696\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2518518518518518,\n\ \ \"acc_stderr\": 0.037498507091740206,\n \"acc_norm\": 0.2518518518518518,\n\ \ \"acc_norm_stderr\": 0.037498507091740206\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.18421052631578946,\n \"acc_stderr\": 0.0315469804508223,\n\ \ \"acc_norm\": 0.18421052631578946,\n \"acc_norm_stderr\": 0.0315469804508223\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.23,\n\ \ \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.23,\n \ \ \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.2679245283018868,\n \"acc_stderr\": 0.027257260322494845,\n\ \ \"acc_norm\": 0.2679245283018868,\n \"acc_norm_stderr\": 0.027257260322494845\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2152777777777778,\n\ \ \"acc_stderr\": 0.034370793441061344,\n \"acc_norm\": 0.2152777777777778,\n\ \ \"acc_norm_stderr\": 
0.034370793441061344\n },\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \ \ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\ acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\"\ : 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \ \ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2138728323699422,\n\ \ \"acc_stderr\": 0.03126511206173042,\n \"acc_norm\": 0.2138728323699422,\n\ \ \"acc_norm_stderr\": 0.03126511206173042\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.03950581861179961,\n\ \ \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.03950581861179961\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.24,\n\ \ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.32340425531914896,\n \"acc_stderr\": 0.030579442773610334,\n\ \ \"acc_norm\": 0.32340425531914896,\n \"acc_norm_stderr\": 0.030579442773610334\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n\ \ \"acc_stderr\": 0.04227054451232199,\n \"acc_norm\": 0.2807017543859649,\n\ \ \"acc_norm_stderr\": 0.04227054451232199\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.22758620689655173,\n \"acc_stderr\": 0.03493950380131184,\n\ \ \"acc_norm\": 0.22758620689655173,\n \"acc_norm_stderr\": 0.03493950380131184\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.2566137566137566,\n \"acc_stderr\": 0.022494510767503154,\n \"\ acc_norm\": 
0.2566137566137566,\n \"acc_norm_stderr\": 0.022494510767503154\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3253968253968254,\n\ \ \"acc_stderr\": 0.041905964388711366,\n \"acc_norm\": 0.3253968253968254,\n\ \ \"acc_norm_stderr\": 0.041905964388711366\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \ \ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.25483870967741934,\n\ \ \"acc_stderr\": 0.024790118459332208,\n \"acc_norm\": 0.25483870967741934,\n\ \ \"acc_norm_stderr\": 0.024790118459332208\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.2955665024630542,\n \"acc_stderr\": 0.032104944337514575,\n\ \ \"acc_norm\": 0.2955665024630542,\n \"acc_norm_stderr\": 0.032104944337514575\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.19,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\"\ : 0.19,\n \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.2545454545454545,\n \"acc_stderr\": 0.03401506715249039,\n\ \ \"acc_norm\": 0.2545454545454545,\n \"acc_norm_stderr\": 0.03401506715249039\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.31313131313131315,\n \"acc_stderr\": 0.033042050878136525,\n \"\ acc_norm\": 0.31313131313131315,\n \"acc_norm_stderr\": 0.033042050878136525\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.36787564766839376,\n \"acc_stderr\": 0.03480175668466036,\n\ \ \"acc_norm\": 0.36787564766839376,\n \"acc_norm_stderr\": 0.03480175668466036\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.2282051282051282,\n \"acc_stderr\": 0.02127839386358628,\n \ \ \"acc_norm\": 0.2282051282051282,\n \"acc_norm_stderr\": 0.02127839386358628\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.26296296296296295,\n \"acc_stderr\": 0.026842057873833706,\n \ \ \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.026842057873833706\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.3487394957983193,\n \"acc_stderr\": 0.03095663632856655,\n \ \ \"acc_norm\": 0.3487394957983193,\n \"acc_norm_stderr\": 0.03095663632856655\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.271523178807947,\n \"acc_stderr\": 0.03631329803969653,\n \"acc_norm\"\ : 0.271523178807947,\n \"acc_norm_stderr\": 0.03631329803969653\n },\n\ \ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.23669724770642203,\n\ \ \"acc_stderr\": 0.01822407811729908,\n \"acc_norm\": 0.23669724770642203,\n\ \ \"acc_norm_stderr\": 0.01822407811729908\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\ : {\n \"acc\": 0.16666666666666666,\n \"acc_stderr\": 0.025416428388767478,\n\ \ \"acc_norm\": 0.16666666666666666,\n \"acc_norm_stderr\": 0.025416428388767478\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.2549019607843137,\n \"acc_stderr\": 0.030587591351604246,\n \"\ acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.030587591351604246\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.20253164556962025,\n \"acc_stderr\": 0.026160568246601457,\n \ \ \"acc_norm\": 0.20253164556962025,\n \"acc_norm_stderr\": 0.026160568246601457\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.10762331838565023,\n\ \ \"acc_stderr\": 0.020799400082879997,\n \"acc_norm\": 0.10762331838565023,\n\ \ \"acc_norm_stderr\": 0.020799400082879997\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.2824427480916031,\n \"acc_stderr\": 0.03948406125768361,\n\ \ \"acc_norm\": 0.2824427480916031,\n \"acc_norm_stderr\": 0.03948406125768361\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.14049586776859505,\n \"acc_stderr\": 0.03172233426002161,\n \"\ acc_norm\": 0.14049586776859505,\n \"acc_norm_stderr\": 0.03172233426002161\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.23148148148148148,\n\ \ \"acc_stderr\": 0.040774947092526284,\n \"acc_norm\": 0.23148148148148148,\n\ \ \"acc_norm_stderr\": 0.040774947092526284\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.22699386503067484,\n \"acc_stderr\": 0.032910995786157686,\n\ \ \"acc_norm\": 0.22699386503067484,\n \"acc_norm_stderr\": 0.032910995786157686\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.21428571428571427,\n\ \ \"acc_stderr\": 0.038946411200447915,\n \"acc_norm\": 0.21428571428571427,\n\ \ \"acc_norm_stderr\": 0.038946411200447915\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.23300970873786409,\n \"acc_stderr\": 0.04185832598928315,\n\ \ \"acc_norm\": 0.23300970873786409,\n \"acc_norm_stderr\": 0.04185832598928315\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2564102564102564,\n\ \ \"acc_stderr\": 0.028605953702004253,\n \"acc_norm\": 0.2564102564102564,\n\ \ \"acc_norm_stderr\": 0.028605953702004253\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \ \ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\ \ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.25287356321839083,\n\ \ \"acc_stderr\": 0.015543377313719678,\n \"acc_norm\": 0.25287356321839083,\n\ \ \"acc_norm_stderr\": 0.015543377313719678\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.24566473988439305,\n \"acc_stderr\": 0.02317629820399201,\n\ \ \"acc_norm\": 0.24566473988439305,\n \"acc_norm_stderr\": 0.02317629820399201\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2681564245810056,\n\ \ \"acc_stderr\": 0.014816119635317005,\n 
\"acc_norm\": 0.2681564245810056,\n\ \ \"acc_norm_stderr\": 0.014816119635317005\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.22875816993464052,\n \"acc_stderr\": 0.024051029739912258,\n\ \ \"acc_norm\": 0.22875816993464052,\n \"acc_norm_stderr\": 0.024051029739912258\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2733118971061093,\n\ \ \"acc_stderr\": 0.02531176597542612,\n \"acc_norm\": 0.2733118971061093,\n\ \ \"acc_norm_stderr\": 0.02531176597542612\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.2654320987654321,\n \"acc_stderr\": 0.024569223600460845,\n\ \ \"acc_norm\": 0.2654320987654321,\n \"acc_norm_stderr\": 0.024569223600460845\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.2765957446808511,\n \"acc_stderr\": 0.026684564340460997,\n \ \ \"acc_norm\": 0.2765957446808511,\n \"acc_norm_stderr\": 0.026684564340460997\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24445893089960888,\n\ \ \"acc_stderr\": 0.010976425013113886,\n \"acc_norm\": 0.24445893089960888,\n\ \ \"acc_norm_stderr\": 0.010976425013113886\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.4485294117647059,\n \"acc_stderr\": 0.030211479609121593,\n\ \ \"acc_norm\": 0.4485294117647059,\n \"acc_norm_stderr\": 0.030211479609121593\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.2679738562091503,\n \"acc_stderr\": 0.017917974069594722,\n \ \ \"acc_norm\": 0.2679738562091503,\n \"acc_norm_stderr\": 0.017917974069594722\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.34545454545454546,\n\ \ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.34545454545454546,\n\ \ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.4,\n \"acc_stderr\": 0.031362502409358936,\n \ \ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.031362502409358936\n \ \ },\n 
\"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.27860696517412936,\n\ \ \"acc_stderr\": 0.03170056183497308,\n \"acc_norm\": 0.27860696517412936,\n\ \ \"acc_norm_stderr\": 0.03170056183497308\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.2,\n \"acc_stderr\": 0.040201512610368466,\n \ \ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.040201512610368466\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.25301204819277107,\n\ \ \"acc_stderr\": 0.03384429155233134,\n \"acc_norm\": 0.25301204819277107,\n\ \ \"acc_norm_stderr\": 0.03384429155233134\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.21052631578947367,\n \"acc_stderr\": 0.0312678171466318,\n\ \ \"acc_norm\": 0.21052631578947367,\n \"acc_norm_stderr\": 0.0312678171466318\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2423500611995104,\n\ \ \"mc1_stderr\": 0.015000674373570342,\n \"mc2\": 0.4963684639744407,\n\ \ \"mc2_stderr\": 0.016496718744347754\n }\n}\n```" repo_url: https://huggingface.co/uukuguy/speechless-codellama-orca-airoboros-13b-0.10e leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_09_05T03_40_07.595318 path: - '**/details_harness|arc:challenge|25_2023-09-05T03:40:07.595318.parquet' - split: 2023_09_12T14_42_21.510480 path: - '**/details_harness|arc:challenge|25_2023-09-12T14-42-21.510480.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-09-12T14-42-21.510480.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_09_05T03_40_07.595318 path: - '**/details_harness|hellaswag|10_2023-09-05T03:40:07.595318.parquet' - split: 2023_09_12T14_42_21.510480 path: - '**/details_harness|hellaswag|10_2023-09-12T14-42-21.510480.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-09-12T14-42-21.510480.parquet' - config_name: harness_hendrycksTest_5 
data_files: - split: 2023_09_05T03_40_07.595318 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T03:40:07.595318.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-09-05T03:40:07.595318.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-09-05T03:40:07.595318.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T03:40:07.595318.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T03:40:07.595318.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-09-05T03:40:07.595318.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T03:40:07.595318.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T03:40:07.595318.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T03:40:07.595318.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T03:40:07.595318.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-09-05T03:40:07.595318.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-09-05T03:40:07.595318.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T03:40:07.595318.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-09-05T03:40:07.595318.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T03:40:07.595318.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T03:40:07.595318.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T03:40:07.595318.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-09-05T03:40:07.595318.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T03:40:07.595318.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T03:40:07.595318.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T03:40:07.595318.parquet' - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T03:40:07.595318.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T03:40:07.595318.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T03:40:07.595318.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T03:40:07.595318.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T03:40:07.595318.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T03:40:07.595318.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T03:40:07.595318.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T03:40:07.595318.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T03:40:07.595318.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T03:40:07.595318.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T03:40:07.595318.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-09-05T03:40:07.595318.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T03:40:07.595318.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-09-05T03:40:07.595318.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T03:40:07.595318.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T03:40:07.595318.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T03:40:07.595318.parquet' - '**/details_harness|hendrycksTest-management|5_2023-09-05T03:40:07.595318.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-09-05T03:40:07.595318.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T03:40:07.595318.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T03:40:07.595318.parquet' - 
'**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T03:40:07.595318.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T03:40:07.595318.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-09-05T03:40:07.595318.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-09-05T03:40:07.595318.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-09-05T03:40:07.595318.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T03:40:07.595318.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-09-05T03:40:07.595318.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T03:40:07.595318.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T03:40:07.595318.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-09-05T03:40:07.595318.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-09-05T03:40:07.595318.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-09-05T03:40:07.595318.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T03:40:07.595318.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-09-05T03:40:07.595318.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-09-05T03:40:07.595318.parquet' - split: 2023_09_12T14_42_21.510480 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T14-42-21.510480.parquet' - 
'**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T14-42-21.510480.parquet' - 
'**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-management|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T14-42-21.510480.parquet' - 
'**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T14-42-21.510480.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T14-42-21.510480.parquet' - 
'**/details_harness|hendrycksTest-computer_security|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T14-42-21.510480.parquet' - 
'**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-management|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T14-42-21.510480.parquet' - 
'**/details_harness|hendrycksTest-sociology|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-09-12T14-42-21.510480.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T14-42-21.510480.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_09_05T03_40_07.595318 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T03:40:07.595318.parquet' - split: 2023_09_12T14_42_21.510480 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T14-42-21.510480.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T14-42-21.510480.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_09_05T03_40_07.595318 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-09-05T03:40:07.595318.parquet' - split: 2023_09_12T14_42_21.510480 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T14-42-21.510480.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T14-42-21.510480.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_09_05T03_40_07.595318 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-09-05T03:40:07.595318.parquet' - split: 2023_09_12T14_42_21.510480 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T14-42-21.510480.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T14-42-21.510480.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_09_05T03_40_07.595318 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T03:40:07.595318.parquet' - split: 2023_09_12T14_42_21.510480 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T14-42-21.510480.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T14-42-21.510480.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_09_05T03_40_07.595318 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T03:40:07.595318.parquet' - split: 2023_09_12T14_42_21.510480 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T14-42-21.510480.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T14-42-21.510480.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_09_05T03_40_07.595318 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-09-05T03:40:07.595318.parquet' - split: 2023_09_12T14_42_21.510480 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T14-42-21.510480.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T14-42-21.510480.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_09_05T03_40_07.595318 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T03:40:07.595318.parquet' - split: 2023_09_12T14_42_21.510480 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T14-42-21.510480.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T14-42-21.510480.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_09_05T03_40_07.595318 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T03:40:07.595318.parquet' - split: 2023_09_12T14_42_21.510480 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T14-42-21.510480.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T14-42-21.510480.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 
2023_09_05T03_40_07.595318 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T03:40:07.595318.parquet' - split: 2023_09_12T14_42_21.510480 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T14-42-21.510480.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T14-42-21.510480.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_09_05T03_40_07.595318 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T03:40:07.595318.parquet' - split: 2023_09_12T14_42_21.510480 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T14-42-21.510480.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T14-42-21.510480.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_09_05T03_40_07.595318 path: - '**/details_harness|hendrycksTest-college_physics|5_2023-09-05T03:40:07.595318.parquet' - split: 2023_09_12T14_42_21.510480 path: - '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T14-42-21.510480.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T14-42-21.510480.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_09_05T03_40_07.595318 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-09-05T03:40:07.595318.parquet' - split: 2023_09_12T14_42_21.510480 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T14-42-21.510480.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T14-42-21.510480.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_09_05T03_40_07.595318 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T03:40:07.595318.parquet' - split: 2023_09_12T14_42_21.510480 path: - 
'**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T14-42-21.510480.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T14-42-21.510480.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_09_05T03_40_07.595318 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-09-05T03:40:07.595318.parquet' - split: 2023_09_12T14_42_21.510480 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T14-42-21.510480.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T14-42-21.510480.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_09_05T03_40_07.595318 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T03:40:07.595318.parquet' - split: 2023_09_12T14_42_21.510480 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T14-42-21.510480.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T14-42-21.510480.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_09_05T03_40_07.595318 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T03:40:07.595318.parquet' - split: 2023_09_12T14_42_21.510480 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T14-42-21.510480.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T14-42-21.510480.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_09_05T03_40_07.595318 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T03:40:07.595318.parquet' - split: 2023_09_12T14_42_21.510480 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T14-42-21.510480.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T14-42-21.510480.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_09_05T03_40_07.595318 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-09-05T03:40:07.595318.parquet' - split: 2023_09_12T14_42_21.510480 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T14-42-21.510480.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T14-42-21.510480.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_09_05T03_40_07.595318 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T03:40:07.595318.parquet' - split: 2023_09_12T14_42_21.510480 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T14-42-21.510480.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T14-42-21.510480.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_09_05T03_40_07.595318 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T03:40:07.595318.parquet' - split: 2023_09_12T14_42_21.510480 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T14-42-21.510480.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T14-42-21.510480.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_09_05T03_40_07.595318 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T03:40:07.595318.parquet' - split: 2023_09_12T14_42_21.510480 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T14-42-21.510480.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T14-42-21.510480.parquet' - config_name: 
harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_09_05T03_40_07.595318 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T03:40:07.595318.parquet' - split: 2023_09_12T14_42_21.510480 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T14-42-21.510480.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T14-42-21.510480.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_09_05T03_40_07.595318 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T03:40:07.595318.parquet' - split: 2023_09_12T14_42_21.510480 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T14-42-21.510480.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T14-42-21.510480.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_09_05T03_40_07.595318 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T03:40:07.595318.parquet' - split: 2023_09_12T14_42_21.510480 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T14-42-21.510480.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T14-42-21.510480.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_09_05T03_40_07.595318 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T03:40:07.595318.parquet' - split: 2023_09_12T14_42_21.510480 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T14-42-21.510480.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T14-42-21.510480.parquet' - config_name: 
harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_09_05T03_40_07.595318 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T03:40:07.595318.parquet' - split: 2023_09_12T14_42_21.510480 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T14-42-21.510480.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T14-42-21.510480.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_09_05T03_40_07.595318 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T03:40:07.595318.parquet' - split: 2023_09_12T14_42_21.510480 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T14-42-21.510480.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T14-42-21.510480.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_09_05T03_40_07.595318 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T03:40:07.595318.parquet' - split: 2023_09_12T14_42_21.510480 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T14-42-21.510480.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T14-42-21.510480.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_09_05T03_40_07.595318 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T03:40:07.595318.parquet' - split: 2023_09_12T14_42_21.510480 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T14-42-21.510480.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T14-42-21.510480.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_09_05T03_40_07.595318 path: - 
'**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T03:40:07.595318.parquet' - split: 2023_09_12T14_42_21.510480 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T14-42-21.510480.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T14-42-21.510480.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_09_05T03_40_07.595318 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T03:40:07.595318.parquet' - split: 2023_09_12T14_42_21.510480 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T14-42-21.510480.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T14-42-21.510480.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_09_05T03_40_07.595318 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T03:40:07.595318.parquet' - split: 2023_09_12T14_42_21.510480 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T14-42-21.510480.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T14-42-21.510480.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_09_05T03_40_07.595318 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-09-05T03:40:07.595318.parquet' - split: 2023_09_12T14_42_21.510480 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T14-42-21.510480.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T14-42-21.510480.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_09_05T03_40_07.595318 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T03:40:07.595318.parquet' - split: 2023_09_12T14_42_21.510480 path: - 
'**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T14-42-21.510480.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T14-42-21.510480.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_09_05T03_40_07.595318 path: - '**/details_harness|hendrycksTest-international_law|5_2023-09-05T03:40:07.595318.parquet' - split: 2023_09_12T14_42_21.510480 path: - '**/details_harness|hendrycksTest-international_law|5_2023-09-12T14-42-21.510480.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-09-12T14-42-21.510480.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_09_05T03_40_07.595318 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T03:40:07.595318.parquet' - split: 2023_09_12T14_42_21.510480 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T14-42-21.510480.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T14-42-21.510480.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_09_05T03_40_07.595318 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T03:40:07.595318.parquet' - split: 2023_09_12T14_42_21.510480 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T14-42-21.510480.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T14-42-21.510480.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_09_05T03_40_07.595318 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T03:40:07.595318.parquet' - split: 2023_09_12T14_42_21.510480 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T14-42-21.510480.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T14-42-21.510480.parquet' - config_name: 
harness_hendrycksTest_management_5 data_files: - split: 2023_09_05T03_40_07.595318 path: - '**/details_harness|hendrycksTest-management|5_2023-09-05T03:40:07.595318.parquet' - split: 2023_09_12T14_42_21.510480 path: - '**/details_harness|hendrycksTest-management|5_2023-09-12T14-42-21.510480.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-09-12T14-42-21.510480.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_09_05T03_40_07.595318 path: - '**/details_harness|hendrycksTest-marketing|5_2023-09-05T03:40:07.595318.parquet' - split: 2023_09_12T14_42_21.510480 path: - '**/details_harness|hendrycksTest-marketing|5_2023-09-12T14-42-21.510480.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-09-12T14-42-21.510480.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_09_05T03_40_07.595318 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T03:40:07.595318.parquet' - split: 2023_09_12T14_42_21.510480 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T14-42-21.510480.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T14-42-21.510480.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_09_05T03_40_07.595318 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T03:40:07.595318.parquet' - split: 2023_09_12T14_42_21.510480 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T14-42-21.510480.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T14-42-21.510480.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_09_05T03_40_07.595318 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T03:40:07.595318.parquet' - split: 2023_09_12T14_42_21.510480 path: - 
'**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T14-42-21.510480.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T14-42-21.510480.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_09_05T03_40_07.595318 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T03:40:07.595318.parquet' - split: 2023_09_12T14_42_21.510480 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T14-42-21.510480.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T14-42-21.510480.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_09_05T03_40_07.595318 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-09-05T03:40:07.595318.parquet' - split: 2023_09_12T14_42_21.510480 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T14-42-21.510480.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T14-42-21.510480.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_09_05T03_40_07.595318 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-09-05T03:40:07.595318.parquet' - split: 2023_09_12T14_42_21.510480 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T14-42-21.510480.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T14-42-21.510480.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_09_05T03_40_07.595318 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-09-05T03:40:07.595318.parquet' - split: 2023_09_12T14_42_21.510480 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T14-42-21.510480.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T14-42-21.510480.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_09_05T03_40_07.595318 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T03:40:07.595318.parquet' - split: 2023_09_12T14_42_21.510480 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T14-42-21.510480.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T14-42-21.510480.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_09_05T03_40_07.595318 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-09-05T03:40:07.595318.parquet' - split: 2023_09_12T14_42_21.510480 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T14-42-21.510480.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T14-42-21.510480.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_09_05T03_40_07.595318 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T03:40:07.595318.parquet' - split: 2023_09_12T14_42_21.510480 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T14-42-21.510480.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T14-42-21.510480.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_09_05T03_40_07.595318 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T03:40:07.595318.parquet' - split: 2023_09_12T14_42_21.510480 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T14-42-21.510480.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T14-42-21.510480.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_09_05T03_40_07.595318 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-09-05T03:40:07.595318.parquet' - split: 
2023_09_12T14_42_21.510480 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T14-42-21.510480.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T14-42-21.510480.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_09_05T03_40_07.595318 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-09-05T03:40:07.595318.parquet' - split: 2023_09_12T14_42_21.510480 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T14-42-21.510480.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T14-42-21.510480.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_09_05T03_40_07.595318 path: - '**/details_harness|hendrycksTest-sociology|5_2023-09-05T03:40:07.595318.parquet' - split: 2023_09_12T14_42_21.510480 path: - '**/details_harness|hendrycksTest-sociology|5_2023-09-12T14-42-21.510480.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-09-12T14-42-21.510480.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_09_05T03_40_07.595318 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T03:40:07.595318.parquet' - split: 2023_09_12T14_42_21.510480 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T14-42-21.510480.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T14-42-21.510480.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_09_05T03_40_07.595318 path: - '**/details_harness|hendrycksTest-virology|5_2023-09-05T03:40:07.595318.parquet' - split: 2023_09_12T14_42_21.510480 path: - '**/details_harness|hendrycksTest-virology|5_2023-09-12T14-42-21.510480.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-09-12T14-42-21.510480.parquet' - config_name: 
harness_hendrycksTest_world_religions_5 data_files: - split: 2023_09_05T03_40_07.595318 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-09-05T03:40:07.595318.parquet' - split: 2023_09_12T14_42_21.510480 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T14-42-21.510480.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T14-42-21.510480.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_09_05T03_40_07.595318 path: - '**/details_harness|truthfulqa:mc|0_2023-09-05T03:40:07.595318.parquet' - split: 2023_09_12T14_42_21.510480 path: - '**/details_harness|truthfulqa:mc|0_2023-09-12T14-42-21.510480.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-09-12T14-42-21.510480.parquet' - config_name: results data_files: - split: 2023_09_05T03_40_07.595318 path: - results_2023-09-05T03:40:07.595318.parquet - split: 2023_09_12T14_42_21.510480 path: - results_2023-09-12T14-42-21.510480.parquet - split: latest path: - results_2023-09-12T14-42-21.510480.parquet --- # Dataset Card for Evaluation run of uukuguy/speechless-codellama-orca-airoboros-13b-0.10e ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/uukuguy/speechless-codellama-orca-airoboros-13b-0.10e - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [uukuguy/speechless-codellama-orca-airoboros-13b-0.10e](https://huggingface.co/uukuguy/speechless-codellama-orca-airoboros-13b-0.10e) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s).
Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_uukuguy__speechless-codellama-orca-airoboros-13b-0.10e",
	"harness_truthfulqa_mc_0",
	split="train")
```

## Latest results

These are the [latest results from run 2023-09-12T14:42:21.510480](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__speechless-codellama-orca-airoboros-13b-0.10e/blob/main/results_2023-09-12T14-42-21.510480.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.2542111825994659, "acc_stderr": 0.03135387998985871, "acc_norm": 0.2549947631985964, "acc_norm_stderr": 0.03136581756825153, "mc1": 0.2423500611995104, "mc1_stderr": 0.015000674373570342, "mc2": 0.4963684639744407, "mc2_stderr": 0.016496718744347754 }, "harness|arc:challenge|25": { "acc": 0.24744027303754265, "acc_stderr": 0.012610352663292673, "acc_norm": 0.29436860068259385, "acc_norm_stderr": 0.013318528460539427 }, "harness|hellaswag|10": { "acc": 0.25781716789484166, "acc_stderr": 0.004365388351563103, "acc_norm": 0.25712009559848636, "acc_norm_stderr": 0.004361529679492747 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.18, "acc_stderr": 0.03861229196653696, "acc_norm": 0.18, "acc_norm_stderr": 0.03861229196653696 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.2518518518518518, "acc_stderr": 0.037498507091740206, "acc_norm": 0.2518518518518518, "acc_norm_stderr": 0.037498507091740206 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.18421052631578946, "acc_stderr": 0.0315469804508223, "acc_norm": 0.18421052631578946, "acc_norm_stderr": 0.0315469804508223 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.23, "acc_stderr": 0.04229525846816506, "acc_norm": 0.23, "acc_norm_stderr": 0.04229525846816506 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.2679245283018868, "acc_stderr": 0.027257260322494845, "acc_norm": 0.2679245283018868, "acc_norm_stderr": 0.027257260322494845 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.2152777777777778, "acc_stderr": 0.034370793441061344, "acc_norm": 0.2152777777777778, "acc_norm_stderr": 0.034370793441061344 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.21, "acc_stderr": 0.040936018074033256, "acc_norm": 0.21, "acc_norm_stderr": 0.040936018074033256 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.27, "acc_stderr": 0.044619604333847394, "acc_norm": 0.27, 
"acc_norm_stderr": 0.044619604333847394 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.23, "acc_stderr": 0.04229525846816506, "acc_norm": 0.23, "acc_norm_stderr": 0.04229525846816506 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.2138728323699422, "acc_stderr": 0.03126511206173042, "acc_norm": 0.2138728323699422, "acc_norm_stderr": 0.03126511206173042 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.19607843137254902, "acc_stderr": 0.03950581861179961, "acc_norm": 0.19607843137254902, "acc_norm_stderr": 0.03950581861179961 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.24, "acc_stderr": 0.042923469599092816, "acc_norm": 0.24, "acc_norm_stderr": 0.042923469599092816 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.32340425531914896, "acc_stderr": 0.030579442773610334, "acc_norm": 0.32340425531914896, "acc_norm_stderr": 0.030579442773610334 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.2807017543859649, "acc_stderr": 0.04227054451232199, "acc_norm": 0.2807017543859649, "acc_norm_stderr": 0.04227054451232199 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.22758620689655173, "acc_stderr": 0.03493950380131184, "acc_norm": 0.22758620689655173, "acc_norm_stderr": 0.03493950380131184 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.2566137566137566, "acc_stderr": 0.022494510767503154, "acc_norm": 0.2566137566137566, "acc_norm_stderr": 0.022494510767503154 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.3253968253968254, "acc_stderr": 0.041905964388711366, "acc_norm": 0.3253968253968254, "acc_norm_stderr": 0.041905964388711366 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.33, "acc_stderr": 0.04725815626252604, "acc_norm": 0.33, "acc_norm_stderr": 0.04725815626252604 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.25483870967741934, "acc_stderr": 0.024790118459332208, "acc_norm": 0.25483870967741934, "acc_norm_stderr": 0.024790118459332208 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.2955665024630542, "acc_stderr": 0.032104944337514575, "acc_norm": 0.2955665024630542, "acc_norm_stderr": 0.032104944337514575 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.19, "acc_stderr": 0.039427724440366234, "acc_norm": 0.19, "acc_norm_stderr": 0.039427724440366234 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.2545454545454545, "acc_stderr": 0.03401506715249039, "acc_norm": 0.2545454545454545, "acc_norm_stderr": 0.03401506715249039 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.31313131313131315, "acc_stderr": 0.033042050878136525, "acc_norm": 0.31313131313131315, "acc_norm_stderr": 0.033042050878136525 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.36787564766839376, "acc_stderr": 0.03480175668466036, "acc_norm": 0.36787564766839376, "acc_norm_stderr": 0.03480175668466036 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.2282051282051282, "acc_stderr": 0.02127839386358628, "acc_norm": 0.2282051282051282, "acc_norm_stderr": 0.02127839386358628 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.26296296296296295, "acc_stderr": 0.026842057873833706, "acc_norm": 0.26296296296296295, "acc_norm_stderr": 0.026842057873833706 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.3487394957983193, "acc_stderr": 0.03095663632856655, "acc_norm": 0.3487394957983193, "acc_norm_stderr": 0.03095663632856655 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.271523178807947, "acc_stderr": 0.03631329803969653, "acc_norm": 0.271523178807947, "acc_norm_stderr": 0.03631329803969653 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.23669724770642203, "acc_stderr": 0.01822407811729908, "acc_norm": 0.23669724770642203, "acc_norm_stderr": 0.01822407811729908 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.16666666666666666, "acc_stderr": 
0.025416428388767478, "acc_norm": 0.16666666666666666, "acc_norm_stderr": 0.025416428388767478 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.2549019607843137, "acc_stderr": 0.030587591351604246, "acc_norm": 0.2549019607843137, "acc_norm_stderr": 0.030587591351604246 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.20253164556962025, "acc_stderr": 0.026160568246601457, "acc_norm": 0.20253164556962025, "acc_norm_stderr": 0.026160568246601457 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.10762331838565023, "acc_stderr": 0.020799400082879997, "acc_norm": 0.10762331838565023, "acc_norm_stderr": 0.020799400082879997 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.2824427480916031, "acc_stderr": 0.03948406125768361, "acc_norm": 0.2824427480916031, "acc_norm_stderr": 0.03948406125768361 }, "harness|hendrycksTest-international_law|5": { "acc": 0.14049586776859505, "acc_stderr": 0.03172233426002161, "acc_norm": 0.14049586776859505, "acc_norm_stderr": 0.03172233426002161 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.23148148148148148, "acc_stderr": 0.040774947092526284, "acc_norm": 0.23148148148148148, "acc_norm_stderr": 0.040774947092526284 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.22699386503067484, "acc_stderr": 0.032910995786157686, "acc_norm": 0.22699386503067484, "acc_norm_stderr": 0.032910995786157686 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.21428571428571427, "acc_stderr": 0.038946411200447915, "acc_norm": 0.21428571428571427, "acc_norm_stderr": 0.038946411200447915 }, "harness|hendrycksTest-management|5": { "acc": 0.23300970873786409, "acc_stderr": 0.04185832598928315, "acc_norm": 0.23300970873786409, "acc_norm_stderr": 0.04185832598928315 }, "harness|hendrycksTest-marketing|5": { "acc": 0.2564102564102564, "acc_stderr": 0.028605953702004253, "acc_norm": 0.2564102564102564, "acc_norm_stderr": 0.028605953702004253 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.26, 
"acc_stderr": 0.0440844002276808, "acc_norm": 0.26, "acc_norm_stderr": 0.0440844002276808 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.25287356321839083, "acc_stderr": 0.015543377313719678, "acc_norm": 0.25287356321839083, "acc_norm_stderr": 0.015543377313719678 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.24566473988439305, "acc_stderr": 0.02317629820399201, "acc_norm": 0.24566473988439305, "acc_norm_stderr": 0.02317629820399201 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.2681564245810056, "acc_stderr": 0.014816119635317005, "acc_norm": 0.2681564245810056, "acc_norm_stderr": 0.014816119635317005 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.22875816993464052, "acc_stderr": 0.024051029739912258, "acc_norm": 0.22875816993464052, "acc_norm_stderr": 0.024051029739912258 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.2733118971061093, "acc_stderr": 0.02531176597542612, "acc_norm": 0.2733118971061093, "acc_norm_stderr": 0.02531176597542612 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.2654320987654321, "acc_stderr": 0.024569223600460845, "acc_norm": 0.2654320987654321, "acc_norm_stderr": 0.024569223600460845 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.2765957446808511, "acc_stderr": 0.026684564340460997, "acc_norm": 0.2765957446808511, "acc_norm_stderr": 0.026684564340460997 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.24445893089960888, "acc_stderr": 0.010976425013113886, "acc_norm": 0.24445893089960888, "acc_norm_stderr": 0.010976425013113886 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.4485294117647059, "acc_stderr": 0.030211479609121593, "acc_norm": 0.4485294117647059, "acc_norm_stderr": 0.030211479609121593 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.2679738562091503, "acc_stderr": 0.017917974069594722, "acc_norm": 0.2679738562091503, "acc_norm_stderr": 0.017917974069594722 }, "harness|hendrycksTest-public_relations|5": { "acc": 
0.34545454545454546, "acc_stderr": 0.04554619617541054, "acc_norm": 0.34545454545454546, "acc_norm_stderr": 0.04554619617541054 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.4, "acc_stderr": 0.031362502409358936, "acc_norm": 0.4, "acc_norm_stderr": 0.031362502409358936 }, "harness|hendrycksTest-sociology|5": { "acc": 0.27860696517412936, "acc_stderr": 0.03170056183497308, "acc_norm": 0.27860696517412936, "acc_norm_stderr": 0.03170056183497308 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.2, "acc_stderr": 0.040201512610368466, "acc_norm": 0.2, "acc_norm_stderr": 0.040201512610368466 }, "harness|hendrycksTest-virology|5": { "acc": 0.25301204819277107, "acc_stderr": 0.03384429155233134, "acc_norm": 0.25301204819277107, "acc_norm_stderr": 0.03384429155233134 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.21052631578947367, "acc_stderr": 0.0312678171466318, "acc_norm": 0.21052631578947367, "acc_norm_stderr": 0.0312678171466318 }, "harness|truthfulqa:mc|0": { "mc1": 0.2423500611995104, "mc1_stderr": 0.015000674373570342, "mc2": 0.4963684639744407, "mc2_stderr": 0.016496718744347754 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? 
[More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
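As the card above notes, each run's split is named after the run timestamp. Comparing the run timestamp `2023-09-12T14:42:21.510480` with the split name `2023_09_12T14_42_21.510480` in the config, the convention appears to be replacing `-` and `:` with `_`. A minimal sketch of that mapping, assuming the same convention holds for other runs:

```python
def run_split_name(timestamp: str) -> str:
    """Map a run timestamp to the split name used in this dataset's configs.

    Assumption (from the config entries above): split names are the run
    timestamp with '-' and ':' replaced by '_'; the 'T' and the
    fractional seconds are kept as-is.
    """
    return timestamp.replace("-", "_").replace(":", "_")

print(run_split_name("2023-09-12T14:42:21.510480"))
# → 2023_09_12T14_42_21.510480
```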
tnash6/test
2023-09-05T03:42:18.000Z
[ "region:us" ]
tnash6
null
null
null
0
0
{"input": "What color is the sky?", "output": "The sky is blue."} {"input": "Where is the best place to get cloud GPUs?", "output": "Brev.dev"} {"input": "Why do Americans love guns so much?", "output": "Because of the Spanish."}
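The card above is raw JSON Lines: one `{"input": ..., "output": ...}` object per line. A minimal sketch of parsing records in that shape with the standard library (the sample strings are taken from the card above):

```python
import json

raw = '''{"input": "What color is the sky?", "output": "The sky is blue."}
{"input": "Where is the best place to get cloud GPUs?", "output": "Brev.dev"}'''

# Each non-empty line is an independent JSON object with "input"/"output" keys.
pairs = [json.loads(line) for line in raw.splitlines() if line.strip()]
print(pairs[0]["output"])  # → The sky is blue.
```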
jschew39/generativeai_sample_data
2023-09-05T03:41:53.000Z
[ "region:us" ]
jschew39
null
null
null
0
0
--- dataset_info: features: - name: product dtype: string - name: description dtype: string - name: marketing_email dtype: string splits: - name: train num_bytes: 23408 num_examples: 12 download_size: 27052 dataset_size: 23408 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "generativeai_sample_data" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
yanabels/churchill-data
2023-09-05T04:04:33.000Z
[ "license:apache-2.0", "region:us" ]
yanabels
null
null
null
0
0
--- license: apache-2.0 dataset_info: features: - name: text dtype: string splits: - name: train num_bytes: 5061 num_examples: 17 download_size: 5411 dataset_size: 5061 configs: - config_name: default data_files: - split: train path: data/train-* --- testing!
lohleonard93/physics4kids
2023-09-05T03:53:20.000Z
[ "region:us" ]
lohleonard93
null
null
null
0
0
--- dataset_info: features: - name: topics dtype: string - name: explain dtype: string - name: simplified dtype: string splits: - name: train num_bytes: 13730 num_examples: 10 download_size: 17516 dataset_size: 13730 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "physics4kids" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
jmz2023/langchain-docs
2023-09-05T03:55:20.000Z
[ "region:us" ]
jmz2023
null
null
null
0
0
Entry not found
MayG/hf_dataset
2023-09-05T03:58:39.000Z
[ "region:us" ]
MayG
null
null
null
0
0
--- dataset_info: features: - name: product dtype: string - name: description dtype: string - name: marketing_email dtype: string splits: - name: train num_bytes: 19405 num_examples: 10 download_size: 26542 dataset_size: 19405 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "hf_dataset" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
urvog/transcripts-llama2-1k
2023-09-06T03:50:06.000Z
[ "license:apache-2.0", "region:us" ]
urvog
null
null
null
0
0
--- license: apache-2.0 dataset_info: features: - name: text dtype: string splits: - name: train num_bytes: 3658329 num_examples: 1000 download_size: 1410820 dataset_size: 3658329 configs: - config_name: default data_files: - split: train path: data/train-* ---
prognosis/guideline-document-v0
2023-09-05T04:00:03.000Z
[ "region:us" ]
prognosis
null
null
null
0
0
Entry not found
kedargsm/marketmail
2023-09-05T04:00:58.000Z
[ "region:us" ]
kedargsm
null
null
null
0
0
--- dataset_info: features: - name: product dtype: string - name: description dtype: string - name: marketing_email dtype: string splits: - name: train num_bytes: 97243 num_examples: 50 download_size: 68524 dataset_size: 97243 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "marketmail" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
schrilax/marketing_campaign_data
2023-09-05T04:46:33.000Z
[ "license:openrail", "region:us" ]
schrilax
null
null
null
0
0
--- license: openrail ---
Jaya1995/Maintenance
2023-09-05T04:15:41.000Z
[ "region:us" ]
Jaya1995
null
null
null
0
0
--- dataset_info: features: - name: sentence dtype: string splits: - name: train num_bytes: 6677 num_examples: 100 download_size: 4106 dataset_size: 6677 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "Maintenance" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
wsin/tobasesentences
2023-09-05T04:12:44.000Z
[ "region:us" ]
wsin
null
null
null
0
0
--- dataset_info: features: - name: sentence dtype: string - name: base base_sentences dtype: string - name: base_sentences dtype: string splits: - name: train num_bytes: 1141 num_examples: 4 download_size: 4035 dataset_size: 1141 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "tobasesentences" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Divya1287/Sentimal_Analysis
2023-09-05T04:25:19.000Z
[ "region:us" ]
Divya1287
null
null
null
0
0
Entry not found
giganion/airoboros-gpt4-m2.0_standardized
2023-09-05T04:35:49.000Z
[ "region:us" ]
giganion
null
null
null
0
0
Entry not found
sachith-surge/LaMini-LM-dataset-TheBloke-h2ogpt-falcon-40b-v2-GGML-eval-llama2
2023-09-05T08:01:10.000Z
[ "region:us" ]
sachith-surge
null
null
null
0
0
--- dataset_info: features: - name: instruction dtype: string - name: source dtype: string - name: response dtype: string - name: llama2_status dtype: string - name: llama2_rating dtype: string - name: llama2_reason dtype: string splits: - name: train num_bytes: 2241878 num_examples: 1505 download_size: 1173351 dataset_size: 2241878 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "LaMini-LM-dataset-TheBloke-h2ogpt-falcon-40b-v2-GGML" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
epicfilmu/czzdarma
2023-09-13T10:22:59.000Z
[ "region:us" ]
epicfilmu
null
null
null
0
0
# EpicGames Freebies Claimer

![image](https://user-images.githubusercontent.com/4411977/74479432-6a6d1b00-4eaf-11ea-930f-1b89e7135887.png)

## ⚠️Status⚠️

EpicGames has made captchas mandatory for claiming free games. Currently, epicgames-freebies-claimer cannot handle this, so [it is not working](https://github.com/Revadike/epicgames-freebies-claimer/issues/172). I am trying to fix it by implementing anti-captcha solutions. You can track my progress [here](https://github.com/Revadike/epicgames-freebies-claimer/pull/184). Any help would be greatly appreciated!

## Description

Claim [available free game promotions](https://www.epicgames.com/store/free-games) from the Epic Games Store.

## Requirements

* [DeviceAuthGenerator](https://github.com/jackblk/DeviceAuthGenerator/releases)
* [Git](https://git-scm.com/downloads)
* [Node.js](https://nodejs.org/download/) (with build tools checked)

> Node version >= 15

## Instructions - Quick

0. (Optional) ☆ Star this project :)
1. Download/clone this repository
2. Run `npm install`
3. Generate `data/device_auths.json` (using [DeviceAuthGenerator](https://github.com/jackblk/DeviceAuthGenerator))
4. (Optional) Copy `data/config.example.json` to `data/config.json` and edit it
5. Run `npm start`

## Instructions - Detailed

Check out the [wiki](https://github.com/Revadike/epicgames-freebies-claimer/wiki), written by @lucifudge.

## Instructions - Docker

Check out the [wiki](https://github.com/Revadike/epicgames-freebies-claimer/wiki/User-Guide-(Docker)), written by @jackblk.

## FAQ

### Why should I use this?

This is for the truly lazy, you know who you are. ;) Also, this is a good alternative in case you don't like using Epic's client or website (and I don't blame you).

### Why should I even bother claiming these free games?

To which I will say, why not? Most of these games are actually outstanding games! Even if you don't like Epic and their shenanigans, you will be pleased to know that Epic actually funds all the free copies that are given away: ["But we actually found it was more economical to pay developers [a lump sum] to distribute their game free for two weeks..."](https://arstechnica.com/gaming/2019/03/epic-ceo-youre-going-to-see-lower-prices-on-epic-games-store/)

## Changelog

[Full changelog in Wiki](https://github.com/Revadike/epicgames-freebies-claimer/releases)

## Happy Freebie Claiming!

![image](https://user-images.githubusercontent.com/4411977/122922274-bb263b00-d363-11eb-8b82-8a3ed6e7e29d.png)
sammyblues/themerlin-04-09-2023
2023-09-05T05:05:11.000Z
[ "region:us" ]
sammyblues
null
null
null
0
0
Entry not found
open-llm-leaderboard/details_AIDC-ai-business__Marcoroni-7B
2023-09-11T14:59:23.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
null
0
0
--- pretty_name: Evaluation run of AIDC-ai-business/Marcoroni-7B dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [AIDC-ai-business/Marcoroni-7B](https://huggingface.co/AIDC-ai-business/Marcoroni-7B)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 61 configuration, each one coresponding to one of the\ \ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" store all the aggregated results of the\ \ run (and is used to compute and display the agregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_AIDC-ai-business__Marcoroni-7B\"\ ,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\ \nThese are the [latest results from run 2023-09-11T14:58:05.245524](https://huggingface.co/datasets/open-llm-leaderboard/details_AIDC-ai-business__Marcoroni-7B/blob/main/results_2023-09-11T14-58-05.245524.json)(note\ \ that their might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5159772470651705,\n\ \ \"acc_stderr\": 0.03490050368845693,\n \"acc_norm\": 0.5196198874675843,\n\ \ \"acc_norm_stderr\": 0.03488383911166199,\n \"mc1\": 0.3574051407588739,\n\ \ \"mc1_stderr\": 0.0167765996767294,\n \"mc2\": 0.5084843623108531,\n\ \ \"mc2_stderr\": 0.015788699144390992\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.5537542662116041,\n \"acc_stderr\": 0.014526705548539982,\n\ \ \"acc_norm\": 0.5810580204778157,\n \"acc_norm_stderr\": 0.014418106953639013\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6132244572794264,\n\ \ \"acc_stderr\": 0.004860162076330978,\n \"acc_norm\": 0.8008364867556264,\n\ \ \"acc_norm_stderr\": 0.0039855506403304606\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \ \ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.48148148148148145,\n\ \ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.48148148148148145,\n\ \ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.04063302731486671,\n\ \ \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.04063302731486671\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.5,\n\ \ \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \ \ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.6,\n \"acc_stderr\": 0.030151134457776285,\n \ \ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.030151134457776285\n \ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5625,\n\ \ \"acc_stderr\": 0.04148415739394154,\n \"acc_norm\": 0.5625,\n \ \ \"acc_norm_stderr\": 0.04148415739394154\n },\n 
\"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \ \ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n\ \ \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \ \ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4797687861271676,\n\ \ \"acc_stderr\": 0.03809342081273957,\n \"acc_norm\": 0.4797687861271676,\n\ \ \"acc_norm_stderr\": 0.03809342081273957\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.04533838195929775,\n\ \ \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.04533838195929775\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n\ \ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.4851063829787234,\n \"acc_stderr\": 0.032671518489247764,\n\ \ \"acc_norm\": 0.4851063829787234,\n \"acc_norm_stderr\": 0.032671518489247764\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.32456140350877194,\n\ \ \"acc_stderr\": 0.044045561573747664,\n \"acc_norm\": 0.32456140350877194,\n\ \ \"acc_norm_stderr\": 0.044045561573747664\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.45517241379310347,\n \"acc_stderr\": 0.04149886942192117,\n\ \ \"acc_norm\": 0.45517241379310347,\n \"acc_norm_stderr\": 0.04149886942192117\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.30952380952380953,\n \"acc_stderr\": 0.023809523809523867,\n \"\ acc_norm\": 0.30952380952380953,\n 
\"acc_norm_stderr\": 0.023809523809523867\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2777777777777778,\n\ \ \"acc_stderr\": 0.040061680838488774,\n \"acc_norm\": 0.2777777777777778,\n\ \ \"acc_norm_stderr\": 0.040061680838488774\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \ \ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5612903225806452,\n\ \ \"acc_stderr\": 0.028229497320317216,\n \"acc_norm\": 0.5612903225806452,\n\ \ \"acc_norm_stderr\": 0.028229497320317216\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.3842364532019704,\n \"acc_stderr\": 0.0342239856565755,\n\ \ \"acc_norm\": 0.3842364532019704,\n \"acc_norm_stderr\": 0.0342239856565755\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\"\ : 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.7090909090909091,\n \"acc_stderr\": 0.03546563019624336,\n\ \ \"acc_norm\": 0.7090909090909091,\n \"acc_norm_stderr\": 0.03546563019624336\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.6818181818181818,\n \"acc_stderr\": 0.0331847733384533,\n \"acc_norm\"\ : 0.6818181818181818,\n \"acc_norm_stderr\": 0.0331847733384533\n },\n\ \ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \ \ \"acc\": 0.7512953367875648,\n \"acc_stderr\": 0.031195840877700286,\n\ \ \"acc_norm\": 0.7512953367875648,\n \"acc_norm_stderr\": 0.031195840877700286\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.48205128205128206,\n \"acc_stderr\": 0.02533466708095495,\n\ \ \"acc_norm\": 0.48205128205128206,\n \"acc_norm_stderr\": 0.02533466708095495\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.2518518518518518,\n \"acc_stderr\": 0.02646611753895991,\n \ \ \"acc_norm\": 0.2518518518518518,\n \"acc_norm_stderr\": 0.02646611753895991\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.5126050420168067,\n \"acc_stderr\": 0.03246816765752174,\n \ \ \"acc_norm\": 0.5126050420168067,\n \"acc_norm_stderr\": 0.03246816765752174\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\ acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.7192660550458716,\n \"acc_stderr\": 0.019266055045871623,\n \"\ acc_norm\": 0.7192660550458716,\n \"acc_norm_stderr\": 0.019266055045871623\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.375,\n \"acc_stderr\": 0.033016908987210894,\n \"acc_norm\": 0.375,\n\ \ \"acc_norm_stderr\": 0.033016908987210894\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\ : {\n \"acc\": 0.696078431372549,\n \"acc_stderr\": 0.03228210387037892,\n\ \ \"acc_norm\": 0.696078431372549,\n \"acc_norm_stderr\": 0.03228210387037892\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.7215189873417721,\n \"acc_stderr\": 0.029178682304842548,\n \ \ \"acc_norm\": 0.7215189873417721,\n \"acc_norm_stderr\": 0.029178682304842548\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5964125560538116,\n\ \ \"acc_stderr\": 0.03292802819330314,\n \"acc_norm\": 0.5964125560538116,\n\ \ \"acc_norm_stderr\": 0.03292802819330314\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.5954198473282443,\n \"acc_stderr\": 0.043046937953806645,\n\ \ \"acc_norm\": 0.5954198473282443,\n \"acc_norm_stderr\": 0.043046937953806645\n\ \ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 
0.6446280991735537,\n \"acc_stderr\": 0.0436923632657398,\n \"acc_norm\"\ : 0.6446280991735537,\n \"acc_norm_stderr\": 0.0436923632657398\n },\n\ \ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6481481481481481,\n\ \ \"acc_stderr\": 0.046166311118017125,\n \"acc_norm\": 0.6481481481481481,\n\ \ \"acc_norm_stderr\": 0.046166311118017125\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.5705521472392638,\n \"acc_stderr\": 0.03889066619112723,\n\ \ \"acc_norm\": 0.5705521472392638,\n \"acc_norm_stderr\": 0.03889066619112723\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.375,\n\ \ \"acc_stderr\": 0.04595091388086298,\n \"acc_norm\": 0.375,\n \ \ \"acc_norm_stderr\": 0.04595091388086298\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.7087378640776699,\n \"acc_stderr\": 0.04498676320572924,\n\ \ \"acc_norm\": 0.7087378640776699,\n \"acc_norm_stderr\": 0.04498676320572924\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7777777777777778,\n\ \ \"acc_stderr\": 0.027236013946196704,\n \"acc_norm\": 0.7777777777777778,\n\ \ \"acc_norm_stderr\": 0.027236013946196704\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\ : 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-miscellaneous|5\"\ : {\n \"acc\": 0.7100893997445722,\n \"acc_stderr\": 0.01622501794477098,\n\ \ \"acc_norm\": 0.7100893997445722,\n \"acc_norm_stderr\": 0.01622501794477098\n\ \ },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5809248554913294,\n\ \ \"acc_stderr\": 0.02656417811142262,\n \"acc_norm\": 0.5809248554913294,\n\ \ \"acc_norm_stderr\": 0.02656417811142262\n },\n \"harness|hendrycksTest-moral_scenarios|5\"\ : {\n \"acc\": 0.264804469273743,\n \"acc_stderr\": 0.014756906483260664,\n\ \ \"acc_norm\": 0.264804469273743,\n \"acc_norm_stderr\": 0.014756906483260664\n\ \ },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 
0.5261437908496732,\n\ \ \"acc_stderr\": 0.028590752958852394,\n \"acc_norm\": 0.5261437908496732,\n\ \ \"acc_norm_stderr\": 0.028590752958852394\n },\n \"harness|hendrycksTest-philosophy|5\"\ : {\n \"acc\": 0.5884244372990354,\n \"acc_stderr\": 0.027950481494401266,\n\ \ \"acc_norm\": 0.5884244372990354,\n \"acc_norm_stderr\": 0.027950481494401266\n\ \ },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5648148148148148,\n\ \ \"acc_stderr\": 0.027586006221607708,\n \"acc_norm\": 0.5648148148148148,\n\ \ \"acc_norm_stderr\": 0.027586006221607708\n },\n \"harness|hendrycksTest-professional_accounting|5\"\ : {\n \"acc\": 0.375886524822695,\n \"acc_stderr\": 0.028893955412115882,\n\ \ \"acc_norm\": 0.375886524822695,\n \"acc_norm_stderr\": 0.028893955412115882\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.38396349413298564,\n\ \ \"acc_stderr\": 0.01242158783313423,\n \"acc_norm\": 0.38396349413298564,\n\ \ \"acc_norm_stderr\": 0.01242158783313423\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.4889705882352941,\n \"acc_stderr\": 0.03036544647727568,\n\ \ \"acc_norm\": 0.4889705882352941,\n \"acc_norm_stderr\": 0.03036544647727568\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.4869281045751634,\n \"acc_stderr\": 0.020220920829626912,\n \ \ \"acc_norm\": 0.4869281045751634,\n \"acc_norm_stderr\": 0.020220920829626912\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5909090909090909,\n\ \ \"acc_stderr\": 0.04709306978661896,\n \"acc_norm\": 0.5909090909090909,\n\ \ \"acc_norm_stderr\": 0.04709306978661896\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.6122448979591837,\n \"acc_stderr\": 0.031192230726795656,\n\ \ \"acc_norm\": 0.6122448979591837,\n \"acc_norm_stderr\": 0.031192230726795656\n\ \ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6119402985074627,\n\ \ \"acc_stderr\": 0.034457899643627506,\n \"acc_norm\": 
0.6119402985074627,\n\ \ \"acc_norm_stderr\": 0.034457899643627506\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \ \ \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.042295258468165065\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.40963855421686746,\n\ \ \"acc_stderr\": 0.03828401115079022,\n \"acc_norm\": 0.40963855421686746,\n\ \ \"acc_norm_stderr\": 0.03828401115079022\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.695906432748538,\n \"acc_stderr\": 0.03528211258245229,\n\ \ \"acc_norm\": 0.695906432748538,\n \"acc_norm_stderr\": 0.03528211258245229\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3574051407588739,\n\ \ \"mc1_stderr\": 0.0167765996767294,\n \"mc2\": 0.5084843623108531,\n\ \ \"mc2_stderr\": 0.015788699144390992\n }\n}\n```" repo_url: https://huggingface.co/AIDC-ai-business/Marcoroni-7B leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_09_05T05_01_15.449449 path: - '**/details_harness|arc:challenge|25_2023-09-05T05:01:15.449449.parquet' - split: 2023_09_11T14_58_05.245524 path: - '**/details_harness|arc:challenge|25_2023-09-11T14-58-05.245524.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-09-11T14-58-05.245524.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_09_05T05_01_15.449449 path: - '**/details_harness|hellaswag|10_2023-09-05T05:01:15.449449.parquet' - split: 2023_09_11T14_58_05.245524 path: - '**/details_harness|hellaswag|10_2023-09-11T14-58-05.245524.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-09-11T14-58-05.245524.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_09_05T05_01_15.449449 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T05:01:15.449449.parquet' - 
'**/details_harness|hendrycksTest-anatomy|5_2023-09-05T05:01:15.449449.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-09-05T05:01:15.449449.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T05:01:15.449449.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T05:01:15.449449.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-09-05T05:01:15.449449.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T05:01:15.449449.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T05:01:15.449449.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T05:01:15.449449.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T05:01:15.449449.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-09-05T05:01:15.449449.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-09-05T05:01:15.449449.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T05:01:15.449449.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-09-05T05:01:15.449449.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T05:01:15.449449.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T05:01:15.449449.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T05:01:15.449449.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-09-05T05:01:15.449449.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T05:01:15.449449.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T05:01:15.449449.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T05:01:15.449449.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T05:01:15.449449.parquet' - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T05:01:15.449449.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T05:01:15.449449.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T05:01:15.449449.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T05:01:15.449449.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T05:01:15.449449.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T05:01:15.449449.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T05:01:15.449449.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T05:01:15.449449.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T05:01:15.449449.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T05:01:15.449449.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-09-05T05:01:15.449449.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T05:01:15.449449.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-09-05T05:01:15.449449.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T05:01:15.449449.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T05:01:15.449449.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T05:01:15.449449.parquet' - '**/details_harness|hendrycksTest-management|5_2023-09-05T05:01:15.449449.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-09-05T05:01:15.449449.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T05:01:15.449449.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T05:01:15.449449.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T05:01:15.449449.parquet' - 
'**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T05:01:15.449449.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-09-05T05:01:15.449449.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-09-05T05:01:15.449449.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-09-05T05:01:15.449449.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T05:01:15.449449.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-09-05T05:01:15.449449.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T05:01:15.449449.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T05:01:15.449449.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-09-05T05:01:15.449449.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-09-05T05:01:15.449449.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-09-05T05:01:15.449449.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T05:01:15.449449.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-09-05T05:01:15.449449.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-09-05T05:01:15.449449.parquet' - split: 2023_09_11T14_58_05.245524 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T14-58-05.245524.parquet' - 
'**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T14-58-05.245524.parquet' - 
'**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-management|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T14-58-05.245524.parquet' - 
'**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T14-58-05.245524.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T14-58-05.245524.parquet' - 
'**/details_harness|hendrycksTest-econometrics|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T14-58-05.245524.parquet' - 
'**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-management|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T14-58-05.245524.parquet' - 
'**/details_harness|hendrycksTest-virology|5_2023-09-11T14-58-05.245524.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T14-58-05.245524.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_09_05T05_01_15.449449 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T05:01:15.449449.parquet' - split: 2023_09_11T14_58_05.245524 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T14-58-05.245524.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T14-58-05.245524.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_09_05T05_01_15.449449 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-09-05T05:01:15.449449.parquet' - split: 2023_09_11T14_58_05.245524 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T14-58-05.245524.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T14-58-05.245524.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_09_05T05_01_15.449449 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-09-05T05:01:15.449449.parquet' - split: 2023_09_11T14_58_05.245524 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T14-58-05.245524.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T14-58-05.245524.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_09_05T05_01_15.449449 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T05:01:15.449449.parquet' - split: 2023_09_11T14_58_05.245524 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T14-58-05.245524.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T14-58-05.245524.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_09_05T05_01_15.449449 
path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T05:01:15.449449.parquet' - split: 2023_09_11T14_58_05.245524 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T14-58-05.245524.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T14-58-05.245524.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_09_05T05_01_15.449449 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-09-05T05:01:15.449449.parquet' - split: 2023_09_11T14_58_05.245524 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T14-58-05.245524.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T14-58-05.245524.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_09_05T05_01_15.449449 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T05:01:15.449449.parquet' - split: 2023_09_11T14_58_05.245524 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T14-58-05.245524.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T14-58-05.245524.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_09_05T05_01_15.449449 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T05:01:15.449449.parquet' - split: 2023_09_11T14_58_05.245524 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T14-58-05.245524.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T14-58-05.245524.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_09_05T05_01_15.449449 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T05:01:15.449449.parquet' - split: 2023_09_11T14_58_05.245524 path: - 
'**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T14-58-05.245524.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T14-58-05.245524.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_09_05T05_01_15.449449 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T05:01:15.449449.parquet' - split: 2023_09_11T14_58_05.245524 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T14-58-05.245524.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T14-58-05.245524.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_09_05T05_01_15.449449 path: - '**/details_harness|hendrycksTest-college_physics|5_2023-09-05T05:01:15.449449.parquet' - split: 2023_09_11T14_58_05.245524 path: - '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T14-58-05.245524.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T14-58-05.245524.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_09_05T05_01_15.449449 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-09-05T05:01:15.449449.parquet' - split: 2023_09_11T14_58_05.245524 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T14-58-05.245524.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T14-58-05.245524.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_09_05T05_01_15.449449 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T05:01:15.449449.parquet' - split: 2023_09_11T14_58_05.245524 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T14-58-05.245524.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T14-58-05.245524.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_09_05T05_01_15.449449 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-09-05T05:01:15.449449.parquet' - split: 2023_09_11T14_58_05.245524 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T14-58-05.245524.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T14-58-05.245524.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_09_05T05_01_15.449449 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T05:01:15.449449.parquet' - split: 2023_09_11T14_58_05.245524 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T14-58-05.245524.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T14-58-05.245524.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_09_05T05_01_15.449449 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T05:01:15.449449.parquet' - split: 2023_09_11T14_58_05.245524 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T14-58-05.245524.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T14-58-05.245524.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_09_05T05_01_15.449449 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T05:01:15.449449.parquet' - split: 2023_09_11T14_58_05.245524 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T14-58-05.245524.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T14-58-05.245524.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_09_05T05_01_15.449449 
path: - '**/details_harness|hendrycksTest-global_facts|5_2023-09-05T05:01:15.449449.parquet' - split: 2023_09_11T14_58_05.245524 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T14-58-05.245524.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T14-58-05.245524.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_09_05T05_01_15.449449 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T05:01:15.449449.parquet' - split: 2023_09_11T14_58_05.245524 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T14-58-05.245524.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T14-58-05.245524.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_09_05T05_01_15.449449 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T05:01:15.449449.parquet' - split: 2023_09_11T14_58_05.245524 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T14-58-05.245524.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T14-58-05.245524.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_09_05T05_01_15.449449 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T05:01:15.449449.parquet' - split: 2023_09_11T14_58_05.245524 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T14-58-05.245524.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T14-58-05.245524.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_09_05T05_01_15.449449 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T05:01:15.449449.parquet' - split: 
2023_09_11T14_58_05.245524 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T14-58-05.245524.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T14-58-05.245524.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_09_05T05_01_15.449449 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T05:01:15.449449.parquet' - split: 2023_09_11T14_58_05.245524 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T14-58-05.245524.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T14-58-05.245524.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_09_05T05_01_15.449449 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T05:01:15.449449.parquet' - split: 2023_09_11T14_58_05.245524 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T14-58-05.245524.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T14-58-05.245524.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_09_05T05_01_15.449449 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T05:01:15.449449.parquet' - split: 2023_09_11T14_58_05.245524 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T14-58-05.245524.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T14-58-05.245524.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_09_05T05_01_15.449449 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T05:01:15.449449.parquet' - split: 2023_09_11T14_58_05.245524 path: - 
'**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T14-58-05.245524.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T14-58-05.245524.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_09_05T05_01_15.449449 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T05:01:15.449449.parquet' - split: 2023_09_11T14_58_05.245524 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T14-58-05.245524.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T14-58-05.245524.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_09_05T05_01_15.449449 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T05:01:15.449449.parquet' - split: 2023_09_11T14_58_05.245524 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T14-58-05.245524.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T14-58-05.245524.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_09_05T05_01_15.449449 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T05:01:15.449449.parquet' - split: 2023_09_11T14_58_05.245524 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T14-58-05.245524.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T14-58-05.245524.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_09_05T05_01_15.449449 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T05:01:15.449449.parquet' - split: 2023_09_11T14_58_05.245524 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T14-58-05.245524.parquet' - split: 
latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T14-58-05.245524.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_09_05T05_01_15.449449 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T05:01:15.449449.parquet' - split: 2023_09_11T14_58_05.245524 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T14-58-05.245524.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T14-58-05.245524.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_09_05T05_01_15.449449 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T05:01:15.449449.parquet' - split: 2023_09_11T14_58_05.245524 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T14-58-05.245524.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T14-58-05.245524.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_09_05T05_01_15.449449 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-09-05T05:01:15.449449.parquet' - split: 2023_09_11T14_58_05.245524 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T14-58-05.245524.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T14-58-05.245524.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_09_05T05_01_15.449449 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T05:01:15.449449.parquet' - split: 2023_09_11T14_58_05.245524 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T14-58-05.245524.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T14-58-05.245524.parquet' - config_name: harness_hendrycksTest_international_law_5 
data_files: - split: 2023_09_05T05_01_15.449449 path: - '**/details_harness|hendrycksTest-international_law|5_2023-09-05T05:01:15.449449.parquet' - split: 2023_09_11T14_58_05.245524 path: - '**/details_harness|hendrycksTest-international_law|5_2023-09-11T14-58-05.245524.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-09-11T14-58-05.245524.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_09_05T05_01_15.449449 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T05:01:15.449449.parquet' - split: 2023_09_11T14_58_05.245524 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T14-58-05.245524.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T14-58-05.245524.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_09_05T05_01_15.449449 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T05:01:15.449449.parquet' - split: 2023_09_11T14_58_05.245524 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T14-58-05.245524.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T14-58-05.245524.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_09_05T05_01_15.449449 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T05:01:15.449449.parquet' - split: 2023_09_11T14_58_05.245524 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T14-58-05.245524.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T14-58-05.245524.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_09_05T05_01_15.449449 path: - '**/details_harness|hendrycksTest-management|5_2023-09-05T05:01:15.449449.parquet' - split: 2023_09_11T14_58_05.245524 path: - 
'**/details_harness|hendrycksTest-management|5_2023-09-11T14-58-05.245524.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-09-11T14-58-05.245524.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_09_05T05_01_15.449449 path: - '**/details_harness|hendrycksTest-marketing|5_2023-09-05T05:01:15.449449.parquet' - split: 2023_09_11T14_58_05.245524 path: - '**/details_harness|hendrycksTest-marketing|5_2023-09-11T14-58-05.245524.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-09-11T14-58-05.245524.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_09_05T05_01_15.449449 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T05:01:15.449449.parquet' - split: 2023_09_11T14_58_05.245524 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T14-58-05.245524.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T14-58-05.245524.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_09_05T05_01_15.449449 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T05:01:15.449449.parquet' - split: 2023_09_11T14_58_05.245524 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T14-58-05.245524.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T14-58-05.245524.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_09_05T05_01_15.449449 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T05:01:15.449449.parquet' - split: 2023_09_11T14_58_05.245524 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T14-58-05.245524.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T14-58-05.245524.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - 
split: 2023_09_05T05_01_15.449449 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T05:01:15.449449.parquet' - split: 2023_09_11T14_58_05.245524 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T14-58-05.245524.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T14-58-05.245524.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_09_05T05_01_15.449449 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-09-05T05:01:15.449449.parquet' - split: 2023_09_11T14_58_05.245524 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T14-58-05.245524.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T14-58-05.245524.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_09_05T05_01_15.449449 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-09-05T05:01:15.449449.parquet' - split: 2023_09_11T14_58_05.245524 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T14-58-05.245524.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T14-58-05.245524.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_09_05T05_01_15.449449 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-09-05T05:01:15.449449.parquet' - split: 2023_09_11T14_58_05.245524 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T14-58-05.245524.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T14-58-05.245524.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2023_09_05T05_01_15.449449 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T05:01:15.449449.parquet' - split: 2023_09_11T14_58_05.245524 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T14-58-05.245524.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T14-58-05.245524.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_09_05T05_01_15.449449 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-09-05T05:01:15.449449.parquet' - split: 2023_09_11T14_58_05.245524 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T14-58-05.245524.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T14-58-05.245524.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_09_05T05_01_15.449449 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T05:01:15.449449.parquet' - split: 2023_09_11T14_58_05.245524 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T14-58-05.245524.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T14-58-05.245524.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_09_05T05_01_15.449449 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T05:01:15.449449.parquet' - split: 2023_09_11T14_58_05.245524 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T14-58-05.245524.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T14-58-05.245524.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_09_05T05_01_15.449449 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-09-05T05:01:15.449449.parquet' - split: 2023_09_11T14_58_05.245524 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T14-58-05.245524.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T14-58-05.245524.parquet' - config_name: 
harness_hendrycksTest_security_studies_5 data_files: - split: 2023_09_05T05_01_15.449449 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-09-05T05:01:15.449449.parquet' - split: 2023_09_11T14_58_05.245524 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T14-58-05.245524.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T14-58-05.245524.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_09_05T05_01_15.449449 path: - '**/details_harness|hendrycksTest-sociology|5_2023-09-05T05:01:15.449449.parquet' - split: 2023_09_11T14_58_05.245524 path: - '**/details_harness|hendrycksTest-sociology|5_2023-09-11T14-58-05.245524.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-09-11T14-58-05.245524.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_09_05T05_01_15.449449 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T05:01:15.449449.parquet' - split: 2023_09_11T14_58_05.245524 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T14-58-05.245524.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T14-58-05.245524.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_09_05T05_01_15.449449 path: - '**/details_harness|hendrycksTest-virology|5_2023-09-05T05:01:15.449449.parquet' - split: 2023_09_11T14_58_05.245524 path: - '**/details_harness|hendrycksTest-virology|5_2023-09-11T14-58-05.245524.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-09-11T14-58-05.245524.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_09_05T05_01_15.449449 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-09-05T05:01:15.449449.parquet' - split: 2023_09_11T14_58_05.245524 path: - 
'**/details_harness|hendrycksTest-world_religions|5_2023-09-11T14-58-05.245524.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T14-58-05.245524.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_09_05T05_01_15.449449 path: - '**/details_harness|truthfulqa:mc|0_2023-09-05T05:01:15.449449.parquet' - split: 2023_09_11T14_58_05.245524 path: - '**/details_harness|truthfulqa:mc|0_2023-09-11T14-58-05.245524.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-09-11T14-58-05.245524.parquet' - config_name: results data_files: - split: 2023_09_05T05_01_15.449449 path: - results_2023-09-05T05:01:15.449449.parquet - split: 2023_09_11T14_58_05.245524 path: - results_2023-09-11T14-58-05.245524.parquet - split: latest path: - results_2023-09-11T14-58-05.245524.parquet
---

# Dataset Card for Evaluation run of AIDC-ai-business/Marcoroni-7B

## Dataset Description

- **Homepage:**
- **Repository:** https://huggingface.co/AIDC-ai-business/Marcoroni-7B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co

### Dataset Summary

Dataset automatically created during the evaluation run of model [AIDC-ai-business/Marcoroni-7B](https://huggingface.co/AIDC-ai-business/Marcoroni-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
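The split names encode the run timestamp with underscores (e.g. `2023_09_11T14_58_05.245524`). As a small illustration of how such a name can be turned into a real timestamp, a helper like the hypothetical `split_to_run_time` below (not part of this dataset's tooling) only needs the standard library:

```python
from datetime import datetime

def split_to_run_time(split_name: str) -> datetime:
    # Split names such as "2023_09_11T14_58_05.245524" encode the run
    # timestamp with underscores; normalize them to a dash-separated form
    # and parse with strptime (%f handles the microseconds).
    normalized = split_name.replace("_", "-")
    return datetime.strptime(normalized, "%Y-%m-%dT%H-%M-%S.%f")
```

Sorting splits by `split_to_run_time` is then one way to determine which run the "latest" split should point to.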
To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_AIDC-ai-business__Marcoroni-7B",
	"harness_truthfulqa_mc_0",
	split="train")
```

## Latest results

These are the [latest results from run 2023-09-11T14:58:05.245524](https://huggingface.co/datasets/open-llm-leaderboard/details_AIDC-ai-business__Marcoroni-7B/blob/main/results_2023-09-11T14-58-05.245524.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "acc": 0.5159772470651705,
        "acc_stderr": 0.03490050368845693,
        "acc_norm": 0.5196198874675843,
        "acc_norm_stderr": 0.03488383911166199,
        "mc1": 0.3574051407588739,
        "mc1_stderr": 0.0167765996767294,
        "mc2": 0.5084843623108531,
        "mc2_stderr": 0.015788699144390992
    },
    "harness|arc:challenge|25": {
        "acc": 0.5537542662116041,
        "acc_stderr": 0.014526705548539982,
        "acc_norm": 0.5810580204778157,
        "acc_norm_stderr": 0.014418106953639013
    },
    "harness|hellaswag|10": {
        "acc": 0.6132244572794264,
        "acc_stderr": 0.004860162076330978,
        "acc_norm": 0.8008364867556264,
        "acc_norm_stderr": 0.0039855506403304606
    },
    "harness|hendrycksTest-abstract_algebra|5": {
        "acc": 0.28,
        "acc_stderr": 0.04512608598542128,
        "acc_norm": 0.28,
        "acc_norm_stderr": 0.04512608598542128
    },
    "harness|hendrycksTest-anatomy|5": {
        "acc": 0.48148148148148145,
        "acc_stderr": 0.043163785995113245,
        "acc_norm": 0.48148148148148145,
        "acc_norm_stderr": 0.043163785995113245
    },
    "harness|hendrycksTest-astronomy|5": {
        "acc": 0.47368421052631576,
        "acc_stderr": 0.04063302731486671,
        "acc_norm": 0.47368421052631576,
        "acc_norm_stderr": 0.04063302731486671
    },
    "harness|hendrycksTest-business_ethics|5": {
        "acc": 0.5,
        "acc_stderr": 0.050251890762960605,
        "acc_norm": 0.5,
        "acc_norm_stderr": 0.050251890762960605
    },
    "harness|hendrycksTest-clinical_knowledge|5": {
        "acc": 0.6,
        "acc_stderr": 0.030151134457776285,
        "acc_norm": 0.6,
        "acc_norm_stderr": 0.030151134457776285
    },
    "harness|hendrycksTest-college_biology|5": {
        "acc": 0.5625,
        "acc_stderr": 0.04148415739394154,
        "acc_norm": 0.5625,
        "acc_norm_stderr": 0.04148415739394154
    },
    "harness|hendrycksTest-college_chemistry|5": {
        "acc": 0.38,
        "acc_stderr": 0.04878317312145632,
        "acc_norm": 0.38,
        "acc_norm_stderr": 0.04878317312145632
    },
    "harness|hendrycksTest-college_computer_science|5": {
        "acc": 0.36,
        "acc_stderr": 0.048241815132442176,
        "acc_norm": 0.36,
        "acc_norm_stderr": 0.048241815132442176
    },
    "harness|hendrycksTest-college_mathematics|5": {
        "acc": 0.28,
        "acc_stderr": 0.04512608598542128,
        "acc_norm": 0.28,
        "acc_norm_stderr": 0.04512608598542128
    },
    "harness|hendrycksTest-college_medicine|5": {
        "acc": 0.4797687861271676,
        "acc_stderr": 0.03809342081273957,
        "acc_norm": 0.4797687861271676,
        "acc_norm_stderr": 0.03809342081273957
    },
    "harness|hendrycksTest-college_physics|5": {
        "acc": 0.29411764705882354,
        "acc_stderr": 0.04533838195929775,
        "acc_norm": 0.29411764705882354,
        "acc_norm_stderr": 0.04533838195929775
    },
    "harness|hendrycksTest-computer_security|5": {
        "acc": 0.62,
        "acc_stderr": 0.048783173121456316,
        "acc_norm": 0.62,
        "acc_norm_stderr": 0.048783173121456316
    },
    "harness|hendrycksTest-conceptual_physics|5": {
        "acc": 0.4851063829787234,
        "acc_stderr": 0.032671518489247764,
        "acc_norm": 0.4851063829787234,
        "acc_norm_stderr": 0.032671518489247764
    },
    "harness|hendrycksTest-econometrics|5": {
        "acc": 0.32456140350877194,
        "acc_stderr": 0.044045561573747664,
        "acc_norm": 0.32456140350877194,
        "acc_norm_stderr": 0.044045561573747664
    },
    "harness|hendrycksTest-electrical_engineering|5": {
        "acc": 0.45517241379310347,
        "acc_stderr": 0.04149886942192117,
        "acc_norm": 0.45517241379310347,
        "acc_norm_stderr": 0.04149886942192117
    },
    "harness|hendrycksTest-elementary_mathematics|5": {
        "acc": 0.30952380952380953,
        "acc_stderr": 0.023809523809523867,
        "acc_norm": 0.30952380952380953,
        "acc_norm_stderr": 0.023809523809523867
    },
    "harness|hendrycksTest-formal_logic|5": {
        "acc": 0.2777777777777778,
        "acc_stderr": 0.040061680838488774,
        "acc_norm": 0.2777777777777778,
        "acc_norm_stderr": 0.040061680838488774
    },
    "harness|hendrycksTest-global_facts|5": {
        "acc": 0.37,
        "acc_stderr": 0.04852365870939099,
        "acc_norm": 0.37,
        "acc_norm_stderr": 0.04852365870939099
    },
    "harness|hendrycksTest-high_school_biology|5": {
        "acc": 0.5612903225806452,
        "acc_stderr": 0.028229497320317216,
        "acc_norm": 0.5612903225806452,
        "acc_norm_stderr": 0.028229497320317216
    },
    "harness|hendrycksTest-high_school_chemistry|5": {
        "acc": 0.3842364532019704,
        "acc_stderr": 0.0342239856565755,
        "acc_norm": 0.3842364532019704,
        "acc_norm_stderr": 0.0342239856565755
    },
    "harness|hendrycksTest-high_school_computer_science|5": {
        "acc": 0.41,
        "acc_stderr": 0.04943110704237102,
        "acc_norm": 0.41,
        "acc_norm_stderr": 0.04943110704237102
    },
    "harness|hendrycksTest-high_school_european_history|5": {
        "acc": 0.7090909090909091,
        "acc_stderr": 0.03546563019624336,
        "acc_norm": 0.7090909090909091,
        "acc_norm_stderr": 0.03546563019624336
    },
    "harness|hendrycksTest-high_school_geography|5": {
        "acc": 0.6818181818181818,
        "acc_stderr": 0.0331847733384533,
        "acc_norm": 0.6818181818181818,
        "acc_norm_stderr": 0.0331847733384533
    },
    "harness|hendrycksTest-high_school_government_and_politics|5": {
        "acc": 0.7512953367875648,
        "acc_stderr": 0.031195840877700286,
        "acc_norm": 0.7512953367875648,
        "acc_norm_stderr": 0.031195840877700286
    },
    "harness|hendrycksTest-high_school_macroeconomics|5": {
        "acc": 0.48205128205128206,
        "acc_stderr": 0.02533466708095495,
        "acc_norm": 0.48205128205128206,
        "acc_norm_stderr": 0.02533466708095495
    },
    "harness|hendrycksTest-high_school_mathematics|5": {
        "acc": 0.2518518518518518,
        "acc_stderr": 0.02646611753895991,
        "acc_norm": 0.2518518518518518,
        "acc_norm_stderr": 0.02646611753895991
    },
    "harness|hendrycksTest-high_school_microeconomics|5": {
        "acc": 0.5126050420168067,
        "acc_stderr": 0.03246816765752174,
        "acc_norm": 0.5126050420168067,
        "acc_norm_stderr": 0.03246816765752174
    },
    "harness|hendrycksTest-high_school_physics|5": {
        "acc": 0.3576158940397351,
        "acc_stderr": 0.03913453431177258,
        "acc_norm": 0.3576158940397351,
        "acc_norm_stderr": 0.03913453431177258
    },
    "harness|hendrycksTest-high_school_psychology|5": {
        "acc": 0.7192660550458716,
        "acc_stderr": 0.019266055045871623,
        "acc_norm": 0.7192660550458716,
        "acc_norm_stderr": 0.019266055045871623
    },
    "harness|hendrycksTest-high_school_statistics|5": {
        "acc": 0.375,
        "acc_stderr": 0.033016908987210894,
        "acc_norm": 0.375,
        "acc_norm_stderr": 0.033016908987210894
    },
    "harness|hendrycksTest-high_school_us_history|5": {
        "acc": 0.696078431372549,
        "acc_stderr": 0.03228210387037892,
        "acc_norm": 0.696078431372549,
        "acc_norm_stderr": 0.03228210387037892
    },
    "harness|hendrycksTest-high_school_world_history|5": {
        "acc": 0.7215189873417721,
        "acc_stderr": 0.029178682304842548,
        "acc_norm": 0.7215189873417721,
        "acc_norm_stderr": 0.029178682304842548
    },
    "harness|hendrycksTest-human_aging|5": {
        "acc": 0.5964125560538116,
        "acc_stderr": 0.03292802819330314,
        "acc_norm": 0.5964125560538116,
        "acc_norm_stderr": 0.03292802819330314
    },
    "harness|hendrycksTest-human_sexuality|5": {
        "acc": 0.5954198473282443,
        "acc_stderr": 0.043046937953806645,
        "acc_norm": 0.5954198473282443,
        "acc_norm_stderr": 0.043046937953806645
    },
    "harness|hendrycksTest-international_law|5": {
        "acc": 0.6446280991735537,
        "acc_stderr": 0.0436923632657398,
        "acc_norm": 0.6446280991735537,
        "acc_norm_stderr": 0.0436923632657398
    },
    "harness|hendrycksTest-jurisprudence|5": {
        "acc": 0.6481481481481481,
        "acc_stderr": 0.046166311118017125,
        "acc_norm": 0.6481481481481481,
        "acc_norm_stderr": 0.046166311118017125
    },
    "harness|hendrycksTest-logical_fallacies|5": {
        "acc": 0.5705521472392638,
        "acc_stderr": 0.03889066619112723,
        "acc_norm": 0.5705521472392638,
        "acc_norm_stderr": 0.03889066619112723
    },
    "harness|hendrycksTest-machine_learning|5": {
        "acc": 0.375,
        "acc_stderr": 0.04595091388086298,
        "acc_norm": 0.375,
        "acc_norm_stderr": 0.04595091388086298
    },
    "harness|hendrycksTest-management|5": {
        "acc": 0.7087378640776699,
        "acc_stderr": 0.04498676320572924,
        "acc_norm": 0.7087378640776699,
        "acc_norm_stderr": 0.04498676320572924
    },
    "harness|hendrycksTest-marketing|5": {
        "acc": 0.7777777777777778,
        "acc_stderr": 0.027236013946196704,
        "acc_norm": 0.7777777777777778,
        "acc_norm_stderr": 0.027236013946196704
    },
    "harness|hendrycksTest-medical_genetics|5": {
        "acc": 0.55,
        "acc_stderr": 0.05,
        "acc_norm": 0.55,
        "acc_norm_stderr": 0.05
    },
    "harness|hendrycksTest-miscellaneous|5": {
        "acc": 0.7100893997445722,
        "acc_stderr": 0.01622501794477098,
        "acc_norm": 0.7100893997445722,
        "acc_norm_stderr": 0.01622501794477098
    },
    "harness|hendrycksTest-moral_disputes|5": {
        "acc": 0.5809248554913294,
        "acc_stderr": 0.02656417811142262,
        "acc_norm": 0.5809248554913294,
        "acc_norm_stderr": 0.02656417811142262
    },
    "harness|hendrycksTest-moral_scenarios|5": {
        "acc": 0.264804469273743,
        "acc_stderr": 0.014756906483260664,
        "acc_norm": 0.264804469273743,
        "acc_norm_stderr": 0.014756906483260664
    },
    "harness|hendrycksTest-nutrition|5": {
        "acc": 0.5261437908496732,
        "acc_stderr": 0.028590752958852394,
        "acc_norm": 0.5261437908496732,
        "acc_norm_stderr": 0.028590752958852394
    },
    "harness|hendrycksTest-philosophy|5": {
        "acc": 0.5884244372990354,
        "acc_stderr": 0.027950481494401266,
        "acc_norm": 0.5884244372990354,
        "acc_norm_stderr": 0.027950481494401266
    },
    "harness|hendrycksTest-prehistory|5": {
        "acc": 0.5648148148148148,
        "acc_stderr": 0.027586006221607708,
        "acc_norm": 0.5648148148148148,
        "acc_norm_stderr": 0.027586006221607708
    },
    "harness|hendrycksTest-professional_accounting|5": {
        "acc": 0.375886524822695,
        "acc_stderr": 0.028893955412115882,
        "acc_norm": 0.375886524822695,
        "acc_norm_stderr": 0.028893955412115882
    },
    "harness|hendrycksTest-professional_law|5": {
        "acc": 0.38396349413298564,
        "acc_stderr": 0.01242158783313423,
        "acc_norm": 0.38396349413298564,
        "acc_norm_stderr": 0.01242158783313423
    },
    "harness|hendrycksTest-professional_medicine|5": {
        "acc": 0.4889705882352941,
        "acc_stderr": 0.03036544647727568,
        "acc_norm": 0.4889705882352941,
        "acc_norm_stderr": 0.03036544647727568
    },
    "harness|hendrycksTest-professional_psychology|5": {
        "acc": 0.4869281045751634,
        "acc_stderr": 0.020220920829626912,
        "acc_norm": 0.4869281045751634,
        "acc_norm_stderr": 0.020220920829626912
    },
    "harness|hendrycksTest-public_relations|5": {
        "acc": 0.5909090909090909,
        "acc_stderr": 0.04709306978661896,
        "acc_norm": 0.5909090909090909,
        "acc_norm_stderr": 0.04709306978661896
    },
    "harness|hendrycksTest-security_studies|5": {
        "acc": 0.6122448979591837,
        "acc_stderr": 0.031192230726795656,
        "acc_norm": 0.6122448979591837,
        "acc_norm_stderr": 0.031192230726795656
    },
    "harness|hendrycksTest-sociology|5": {
        "acc": 0.6119402985074627,
        "acc_stderr": 0.034457899643627506,
        "acc_norm": 0.6119402985074627,
        "acc_norm_stderr": 0.034457899643627506
    },
    "harness|hendrycksTest-us_foreign_policy|5": {
        "acc": 0.77,
        "acc_stderr": 0.042295258468165065,
        "acc_norm": 0.77,
        "acc_norm_stderr": 0.042295258468165065
    },
    "harness|hendrycksTest-virology|5": {
        "acc": 0.40963855421686746,
        "acc_stderr": 0.03828401115079022,
        "acc_norm": 0.40963855421686746,
        "acc_norm_stderr": 0.03828401115079022
    },
    "harness|hendrycksTest-world_religions|5": {
        "acc": 0.695906432748538,
        "acc_stderr": 0.03528211258245229,
        "acc_norm": 0.695906432748538,
        "acc_norm_stderr": 0.03528211258245229
    },
    "harness|truthfulqa:mc|0": {
        "mc1": 0.3574051407588739,
        "mc1_stderr": 0.0167765996767294,
        "mc2": 0.5084843623108531,
        "mc2_stderr": 0.015788699144390992
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
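The per-task results in the card above all share the same metric layout, so simple aggregations are easy to sketch. The `mean_accuracy` helper below is a hypothetical illustration (not the leaderboard's own aggregation code) of averaging the `acc` metric over the MMLU-style subject entries:

```python
def mean_accuracy(results: dict) -> float:
    # Keys of the form "harness|hendrycksTest-<subject>|5" hold the
    # per-subject MMLU results; average their "acc" values, skipping
    # entries (ARC, HellaSwag, TruthfulQA, "all") that use other keys.
    accs = [
        scores["acc"]
        for task, scores in results.items()
        if task.startswith("harness|hendrycksTest-")
    ]
    return sum(accs) / len(accs)
```

Applied to the full "Latest results" dictionary, this produces a per-subject aggregate similar in spirit to what the "all" entry reports.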
open-llm-leaderboard/details_Undi95__MLewd-L2-13B
2023-09-05T05:07:35.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
null
0
0
--- pretty_name: Evaluation run of Undi95/MLewd-L2-13B dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [Undi95/MLewd-L2-13B](https://huggingface.co/Undi95/MLewd-L2-13B) on the [Open\ \ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 61 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split always points to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Undi95__MLewd-L2-13B\"\ ,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\ \nThese are the [latest results from run 2023-09-05T05:06:12.728207](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__MLewd-L2-13B/blob/main/results_2023-09-05T05%3A06%3A12.728207.json) (note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5480941554989293,\n\ \ \"acc_stderr\": 0.03458699327783796,\n \"acc_norm\": 0.5520017097570462,\n\ \ \"acc_norm_stderr\": 0.03456746461558375,\n \"mc1\": 0.3390452876376989,\n\ \ \"mc1_stderr\": 0.016571797910626605,\n \"mc2\": 0.4866402159418837,\n\ \ \"mc2_stderr\": 0.015878252541467283\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.5435153583617748,\n \"acc_stderr\": 0.01455594976049644,\n\ \ \"acc_norm\": 0.5827645051194539,\n \"acc_norm_stderr\": 0.014409825518403077\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6319458275243975,\n\ \ \"acc_stderr\": 0.004812905279066437,\n \"acc_norm\": 0.8232423819956184,\n\ \ \"acc_norm_stderr\": 0.00380683844816174\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \ \ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4740740740740741,\n\ \ \"acc_stderr\": 0.04313531696750574,\n \"acc_norm\": 0.4740740740740741,\n\ \ \"acc_norm_stderr\": 0.04313531696750574\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.5328947368421053,\n \"acc_stderr\": 0.04060127035236395,\n\ \ \"acc_norm\": 0.5328947368421053,\n \"acc_norm_stderr\": 0.04060127035236395\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n\ \ \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \ \ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.5886792452830188,\n \"acc_stderr\": 0.03028500925900979,\n\ \ \"acc_norm\": 0.5886792452830188,\n \"acc_norm_stderr\": 0.03028500925900979\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5555555555555556,\n\ \ \"acc_stderr\": 0.041553199555931467,\n \"acc_norm\": 0.5555555555555556,\n\ \ \"acc_norm_stderr\": 0.041553199555931467\n },\n 
\"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \ \ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\ acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\"\ : 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \ \ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4682080924855491,\n\ \ \"acc_stderr\": 0.03804749744364764,\n \"acc_norm\": 0.4682080924855491,\n\ \ \"acc_norm_stderr\": 0.03804749744364764\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.3137254901960784,\n \"acc_stderr\": 0.04617034827006716,\n\ \ \"acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.04617034827006716\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n\ \ \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.43829787234042555,\n \"acc_stderr\": 0.03243618636108101,\n\ \ \"acc_norm\": 0.43829787234042555,\n \"acc_norm_stderr\": 0.03243618636108101\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n\ \ \"acc_stderr\": 0.042270544512322,\n \"acc_norm\": 0.2807017543859649,\n\ \ \"acc_norm_stderr\": 0.042270544512322\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.4896551724137931,\n \"acc_stderr\": 0.04165774775728763,\n\ \ \"acc_norm\": 0.4896551724137931,\n \"acc_norm_stderr\": 0.04165774775728763\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.3253968253968254,\n \"acc_stderr\": 0.024130158299762613,\n \"\ acc_norm\": 0.3253968253968254,\n \"acc_norm_stderr\": 
0.024130158299762613\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2777777777777778,\n\ \ \"acc_stderr\": 0.04006168083848878,\n \"acc_norm\": 0.2777777777777778,\n\ \ \"acc_norm_stderr\": 0.04006168083848878\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \ \ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.632258064516129,\n\ \ \"acc_stderr\": 0.02743086657997347,\n \"acc_norm\": 0.632258064516129,\n\ \ \"acc_norm_stderr\": 0.02743086657997347\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.43349753694581283,\n \"acc_stderr\": 0.03486731727419872,\n\ \ \"acc_norm\": 0.43349753694581283,\n \"acc_norm_stderr\": 0.03486731727419872\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\"\ : 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.036085410115739666,\n\ \ \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.036085410115739666\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.6868686868686869,\n \"acc_stderr\": 0.033042050878136525,\n \"\ acc_norm\": 0.6868686868686869,\n \"acc_norm_stderr\": 0.033042050878136525\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.8082901554404145,\n \"acc_stderr\": 0.028408953626245258,\n\ \ \"acc_norm\": 0.8082901554404145,\n \"acc_norm_stderr\": 0.028408953626245258\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.5153846153846153,\n \"acc_stderr\": 0.025339003010106515,\n\ \ \"acc_norm\": 0.5153846153846153,\n \"acc_norm_stderr\": 0.025339003010106515\n\ \ },\n \"harness|hendrycksTest-high_school_mathematics|5\": 
{\n \"\ acc\": 0.29259259259259257,\n \"acc_stderr\": 0.02773896963217609,\n \ \ \"acc_norm\": 0.29259259259259257,\n \"acc_norm_stderr\": 0.02773896963217609\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.5378151260504201,\n \"acc_stderr\": 0.03238546948758979,\n \ \ \"acc_norm\": 0.5378151260504201,\n \"acc_norm_stderr\": 0.03238546948758979\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526733,\n \"\ acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526733\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.726605504587156,\n \"acc_stderr\": 0.019109299846098278,\n \"\ acc_norm\": 0.726605504587156,\n \"acc_norm_stderr\": 0.019109299846098278\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.3287037037037037,\n \"acc_stderr\": 0.03203614084670058,\n \"\ acc_norm\": 0.3287037037037037,\n \"acc_norm_stderr\": 0.03203614084670058\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.7352941176470589,\n \"acc_stderr\": 0.030964517926923403,\n \"\ acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.030964517926923403\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.70042194092827,\n \"acc_stderr\": 0.029818024749753095,\n \ \ \"acc_norm\": 0.70042194092827,\n \"acc_norm_stderr\": 0.029818024749753095\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\ \ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\ \ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.6793893129770993,\n \"acc_stderr\": 0.04093329229834278,\n\ \ \"acc_norm\": 0.6793893129770993,\n \"acc_norm_stderr\": 0.04093329229834278\n\ \ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.7355371900826446,\n \"acc_stderr\": 
0.040261875275912073,\n \"\ acc_norm\": 0.7355371900826446,\n \"acc_norm_stderr\": 0.040261875275912073\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7037037037037037,\n\ \ \"acc_stderr\": 0.04414343666854934,\n \"acc_norm\": 0.7037037037037037,\n\ \ \"acc_norm_stderr\": 0.04414343666854934\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.6748466257668712,\n \"acc_stderr\": 0.036803503712864616,\n\ \ \"acc_norm\": 0.6748466257668712,\n \"acc_norm_stderr\": 0.036803503712864616\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3482142857142857,\n\ \ \"acc_stderr\": 0.04521829902833586,\n \"acc_norm\": 0.3482142857142857,\n\ \ \"acc_norm_stderr\": 0.04521829902833586\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.6990291262135923,\n \"acc_stderr\": 0.04541609446503949,\n\ \ \"acc_norm\": 0.6990291262135923,\n \"acc_norm_stderr\": 0.04541609446503949\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.811965811965812,\n\ \ \"acc_stderr\": 0.02559819368665225,\n \"acc_norm\": 0.811965811965812,\n\ \ \"acc_norm_stderr\": 0.02559819368665225\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \ \ \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7203065134099617,\n\ \ \"acc_stderr\": 0.01605079214803652,\n \"acc_norm\": 0.7203065134099617,\n\ \ \"acc_norm_stderr\": 0.01605079214803652\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.6242774566473989,\n \"acc_stderr\": 0.02607431485165708,\n\ \ \"acc_norm\": 0.6242774566473989,\n \"acc_norm_stderr\": 0.02607431485165708\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3843575418994413,\n\ \ \"acc_stderr\": 0.016269088663959406,\n \"acc_norm\": 0.3843575418994413,\n\ \ \"acc_norm_stderr\": 0.016269088663959406\n },\n 
\"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.6143790849673203,\n \"acc_stderr\": 0.02787074527829027,\n\ \ \"acc_norm\": 0.6143790849673203,\n \"acc_norm_stderr\": 0.02787074527829027\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6141479099678456,\n\ \ \"acc_stderr\": 0.027648149599751468,\n \"acc_norm\": 0.6141479099678456,\n\ \ \"acc_norm_stderr\": 0.027648149599751468\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.5895061728395061,\n \"acc_stderr\": 0.027371350925124764,\n\ \ \"acc_norm\": 0.5895061728395061,\n \"acc_norm_stderr\": 0.027371350925124764\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.42907801418439717,\n \"acc_stderr\": 0.02952591430255855,\n \ \ \"acc_norm\": 0.42907801418439717,\n \"acc_norm_stderr\": 0.02952591430255855\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4074315514993481,\n\ \ \"acc_stderr\": 0.012549473714212226,\n \"acc_norm\": 0.4074315514993481,\n\ \ \"acc_norm_stderr\": 0.012549473714212226\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.5147058823529411,\n \"acc_stderr\": 0.03035969707904612,\n\ \ \"acc_norm\": 0.5147058823529411,\n \"acc_norm_stderr\": 0.03035969707904612\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.5441176470588235,\n \"acc_stderr\": 0.020148939420415745,\n \ \ \"acc_norm\": 0.5441176470588235,\n \"acc_norm_stderr\": 0.020148939420415745\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n\ \ \"acc_stderr\": 0.04631381319425465,\n \"acc_norm\": 0.6272727272727273,\n\ \ \"acc_norm_stderr\": 0.04631381319425465\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.6530612244897959,\n \"acc_stderr\": 0.0304725260267265,\n\ \ \"acc_norm\": 0.6530612244897959,\n \"acc_norm_stderr\": 0.0304725260267265\n\ \ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7064676616915423,\n\ \ 
\"acc_stderr\": 0.03220024104534205,\n \"acc_norm\": 0.7064676616915423,\n\ \ \"acc_norm_stderr\": 0.03220024104534205\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \ \ \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.463855421686747,\n\ \ \"acc_stderr\": 0.03882310850890593,\n \"acc_norm\": 0.463855421686747,\n\ \ \"acc_norm_stderr\": 0.03882310850890593\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.7309941520467836,\n \"acc_stderr\": 0.03401052620104089,\n\ \ \"acc_norm\": 0.7309941520467836,\n \"acc_norm_stderr\": 0.03401052620104089\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3390452876376989,\n\ \ \"mc1_stderr\": 0.016571797910626605,\n \"mc2\": 0.4866402159418837,\n\ \ \"mc2_stderr\": 0.015878252541467283\n }\n}\n```" repo_url: https://huggingface.co/Undi95/MLewd-L2-13B leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_09_05T05_06_12.728207 path: - '**/details_harness|arc:challenge|25_2023-09-05T05:06:12.728207.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-09-05T05:06:12.728207.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_09_05T05_06_12.728207 path: - '**/details_harness|hellaswag|10_2023-09-05T05:06:12.728207.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-09-05T05:06:12.728207.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_09_05T05_06_12.728207 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-09-05T05:06:12.728207.parquet' - 
'**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T05:06:12.728207.parquet' - 
'**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-management|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-09-05T05:06:12.728207.parquet' - 
'**/details_harness|hendrycksTest-philosophy|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-09-05T05:06:12.728207.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T05:06:12.728207.parquet' - 
'**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T05:06:12.728207.parquet' - 
'**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-management|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T05:06:12.728207.parquet' - 
'**/details_harness|hendrycksTest-public_relations|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-09-05T05:06:12.728207.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-09-05T05:06:12.728207.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_09_05T05_06_12.728207 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T05:06:12.728207.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T05:06:12.728207.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_09_05T05_06_12.728207 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-09-05T05:06:12.728207.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-09-05T05:06:12.728207.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_09_05T05_06_12.728207 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-09-05T05:06:12.728207.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-09-05T05:06:12.728207.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_09_05T05_06_12.728207 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T05:06:12.728207.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T05:06:12.728207.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_09_05T05_06_12.728207 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T05:06:12.728207.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T05:06:12.728207.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_09_05T05_06_12.728207 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-09-05T05:06:12.728207.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-09-05T05:06:12.728207.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_09_05T05_06_12.728207 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T05:06:12.728207.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T05:06:12.728207.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_09_05T05_06_12.728207 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T05:06:12.728207.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T05:06:12.728207.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_09_05T05_06_12.728207 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T05:06:12.728207.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T05:06:12.728207.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_09_05T05_06_12.728207 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T05:06:12.728207.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T05:06:12.728207.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_09_05T05_06_12.728207 path: - '**/details_harness|hendrycksTest-college_physics|5_2023-09-05T05:06:12.728207.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-09-05T05:06:12.728207.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_09_05T05_06_12.728207 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-09-05T05:06:12.728207.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-09-05T05:06:12.728207.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_09_05T05_06_12.728207 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T05:06:12.728207.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T05:06:12.728207.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_09_05T05_06_12.728207 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-09-05T05:06:12.728207.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-09-05T05:06:12.728207.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_09_05T05_06_12.728207 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T05:06:12.728207.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T05:06:12.728207.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_09_05T05_06_12.728207 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T05:06:12.728207.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T05:06:12.728207.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_09_05T05_06_12.728207 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T05:06:12.728207.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T05:06:12.728207.parquet' - 
config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_09_05T05_06_12.728207 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-09-05T05:06:12.728207.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-09-05T05:06:12.728207.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_09_05T05_06_12.728207 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T05:06:12.728207.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T05:06:12.728207.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_09_05T05_06_12.728207 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T05:06:12.728207.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T05:06:12.728207.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_09_05T05_06_12.728207 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T05:06:12.728207.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T05:06:12.728207.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_09_05T05_06_12.728207 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T05:06:12.728207.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T05:06:12.728207.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_09_05T05_06_12.728207 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T05:06:12.728207.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T05:06:12.728207.parquet' - config_name: 
harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_09_05T05_06_12.728207 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T05:06:12.728207.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T05:06:12.728207.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_09_05T05_06_12.728207 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T05:06:12.728207.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T05:06:12.728207.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_09_05T05_06_12.728207 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T05:06:12.728207.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T05:06:12.728207.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_09_05T05_06_12.728207 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T05:06:12.728207.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T05:06:12.728207.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_09_05T05_06_12.728207 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T05:06:12.728207.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T05:06:12.728207.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_09_05T05_06_12.728207 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T05:06:12.728207.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T05:06:12.728207.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_09_05T05_06_12.728207 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T05:06:12.728207.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T05:06:12.728207.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_09_05T05_06_12.728207 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T05:06:12.728207.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T05:06:12.728207.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_09_05T05_06_12.728207 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T05:06:12.728207.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T05:06:12.728207.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_09_05T05_06_12.728207 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-09-05T05:06:12.728207.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-09-05T05:06:12.728207.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_09_05T05_06_12.728207 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T05:06:12.728207.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T05:06:12.728207.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_09_05T05_06_12.728207 path: - '**/details_harness|hendrycksTest-international_law|5_2023-09-05T05:06:12.728207.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-09-05T05:06:12.728207.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_09_05T05_06_12.728207 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T05:06:12.728207.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T05:06:12.728207.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_09_05T05_06_12.728207 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T05:06:12.728207.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T05:06:12.728207.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_09_05T05_06_12.728207 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T05:06:12.728207.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T05:06:12.728207.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_09_05T05_06_12.728207 path: - '**/details_harness|hendrycksTest-management|5_2023-09-05T05:06:12.728207.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-09-05T05:06:12.728207.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_09_05T05_06_12.728207 path: - '**/details_harness|hendrycksTest-marketing|5_2023-09-05T05:06:12.728207.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-09-05T05:06:12.728207.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_09_05T05_06_12.728207 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T05:06:12.728207.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T05:06:12.728207.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - 
split: 2023_09_05T05_06_12.728207 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T05:06:12.728207.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T05:06:12.728207.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_09_05T05_06_12.728207 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T05:06:12.728207.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T05:06:12.728207.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_09_05T05_06_12.728207 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T05:06:12.728207.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T05:06:12.728207.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_09_05T05_06_12.728207 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-09-05T05:06:12.728207.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-09-05T05:06:12.728207.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_09_05T05_06_12.728207 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-09-05T05:06:12.728207.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-09-05T05:06:12.728207.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_09_05T05_06_12.728207 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-09-05T05:06:12.728207.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-09-05T05:06:12.728207.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2023_09_05T05_06_12.728207 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T05:06:12.728207.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T05:06:12.728207.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_09_05T05_06_12.728207 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-09-05T05:06:12.728207.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-09-05T05:06:12.728207.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_09_05T05_06_12.728207 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T05:06:12.728207.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T05:06:12.728207.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_09_05T05_06_12.728207 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T05:06:12.728207.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T05:06:12.728207.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_09_05T05_06_12.728207 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-09-05T05:06:12.728207.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-09-05T05:06:12.728207.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_09_05T05_06_12.728207 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-09-05T05:06:12.728207.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-09-05T05:06:12.728207.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_09_05T05_06_12.728207 path: - '**/details_harness|hendrycksTest-sociology|5_2023-09-05T05:06:12.728207.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-09-05T05:06:12.728207.parquet' - 
config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_09_05T05_06_12.728207 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T05:06:12.728207.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T05:06:12.728207.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_09_05T05_06_12.728207 path: - '**/details_harness|hendrycksTest-virology|5_2023-09-05T05:06:12.728207.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-09-05T05:06:12.728207.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_09_05T05_06_12.728207 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-09-05T05:06:12.728207.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-09-05T05:06:12.728207.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_09_05T05_06_12.728207 path: - '**/details_harness|truthfulqa:mc|0_2023-09-05T05:06:12.728207.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-09-05T05:06:12.728207.parquet' - config_name: results data_files: - split: 2023_09_05T05_06_12.728207 path: - results_2023-09-05T05:06:12.728207.parquet - split: latest path: - results_2023-09-05T05:06:12.728207.parquet --- # Dataset Card for Evaluation run of Undi95/MLewd-L2-13B ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/Undi95/MLewd-L2-13B - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [Undi95/MLewd-L2-13B](https://huggingface.co/Undi95/MLewd-L2-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). 
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_Undi95__MLewd-L2-13B", "harness_truthfulqa_mc_0", split="train") ``` ## Latest results These are the [latest results from run 2023-09-05T05:06:12.728207](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__MLewd-L2-13B/blob/main/results_2023-09-05T05%3A06%3A12.728207.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.5480941554989293, "acc_stderr": 0.03458699327783796, "acc_norm": 0.5520017097570462, "acc_norm_stderr": 0.03456746461558375, "mc1": 0.3390452876376989, "mc1_stderr": 0.016571797910626605, "mc2": 0.4866402159418837, "mc2_stderr": 0.015878252541467283 }, "harness|arc:challenge|25": { "acc": 0.5435153583617748, "acc_stderr": 0.01455594976049644, "acc_norm": 0.5827645051194539, "acc_norm_stderr": 0.014409825518403077 }, "harness|hellaswag|10": { "acc": 0.6319458275243975, "acc_stderr": 0.004812905279066437, "acc_norm": 0.8232423819956184, "acc_norm_stderr": 0.00380683844816174 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.36, "acc_stderr": 0.04824181513244218, "acc_norm": 0.36, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.4740740740740741, "acc_stderr": 0.04313531696750574, "acc_norm": 0.4740740740740741, "acc_norm_stderr": 0.04313531696750574 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.5328947368421053, "acc_stderr": 0.04060127035236395, "acc_norm": 0.5328947368421053, "acc_norm_stderr": 0.04060127035236395 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.53, "acc_stderr": 0.05016135580465919, "acc_norm": 0.53, "acc_norm_stderr": 0.05016135580465919 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.5886792452830188, "acc_stderr": 0.03028500925900979, "acc_norm": 0.5886792452830188, "acc_norm_stderr": 0.03028500925900979 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.5555555555555556, "acc_stderr": 0.041553199555931467, "acc_norm": 0.5555555555555556, "acc_norm_stderr": 0.041553199555931467 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.43, "acc_stderr": 0.049756985195624284, "acc_norm": 0.43, "acc_norm_stderr": 0.049756985195624284 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.43, "acc_stderr": 0.049756985195624284, "acc_norm": 0.43, "acc_norm_stderr": 
0.049756985195624284 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.35, "acc_stderr": 0.047937248544110196, "acc_norm": 0.35, "acc_norm_stderr": 0.047937248544110196 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.4682080924855491, "acc_stderr": 0.03804749744364764, "acc_norm": 0.4682080924855491, "acc_norm_stderr": 0.03804749744364764 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.3137254901960784, "acc_stderr": 0.04617034827006716, "acc_norm": 0.3137254901960784, "acc_norm_stderr": 0.04617034827006716 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.68, "acc_stderr": 0.04688261722621505, "acc_norm": 0.68, "acc_norm_stderr": 0.04688261722621505 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.43829787234042555, "acc_stderr": 0.03243618636108101, "acc_norm": 0.43829787234042555, "acc_norm_stderr": 0.03243618636108101 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.2807017543859649, "acc_stderr": 0.042270544512322, "acc_norm": 0.2807017543859649, "acc_norm_stderr": 0.042270544512322 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.4896551724137931, "acc_stderr": 0.04165774775728763, "acc_norm": 0.4896551724137931, "acc_norm_stderr": 0.04165774775728763 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.3253968253968254, "acc_stderr": 0.024130158299762613, "acc_norm": 0.3253968253968254, "acc_norm_stderr": 0.024130158299762613 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.2777777777777778, "acc_stderr": 0.04006168083848878, "acc_norm": 0.2777777777777778, "acc_norm_stderr": 0.04006168083848878 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.34, "acc_stderr": 0.04760952285695235, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695235 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.632258064516129, "acc_stderr": 0.02743086657997347, "acc_norm": 0.632258064516129, "acc_norm_stderr": 0.02743086657997347 }, "harness|hendrycksTest-high_school_chemistry|5": 
{ "acc": 0.43349753694581283, "acc_stderr": 0.03486731727419872, "acc_norm": 0.43349753694581283, "acc_norm_stderr": 0.03486731727419872 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.57, "acc_stderr": 0.04975698519562428, "acc_norm": 0.57, "acc_norm_stderr": 0.04975698519562428 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.6909090909090909, "acc_stderr": 0.036085410115739666, "acc_norm": 0.6909090909090909, "acc_norm_stderr": 0.036085410115739666 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.6868686868686869, "acc_stderr": 0.033042050878136525, "acc_norm": 0.6868686868686869, "acc_norm_stderr": 0.033042050878136525 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8082901554404145, "acc_stderr": 0.028408953626245258, "acc_norm": 0.8082901554404145, "acc_norm_stderr": 0.028408953626245258 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5153846153846153, "acc_stderr": 0.025339003010106515, "acc_norm": 0.5153846153846153, "acc_norm_stderr": 0.025339003010106515 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.29259259259259257, "acc_stderr": 0.02773896963217609, "acc_norm": 0.29259259259259257, "acc_norm_stderr": 0.02773896963217609 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.5378151260504201, "acc_stderr": 0.03238546948758979, "acc_norm": 0.5378151260504201, "acc_norm_stderr": 0.03238546948758979 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.31125827814569534, "acc_stderr": 0.03780445850526733, "acc_norm": 0.31125827814569534, "acc_norm_stderr": 0.03780445850526733 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.726605504587156, "acc_stderr": 0.019109299846098278, "acc_norm": 0.726605504587156, "acc_norm_stderr": 0.019109299846098278 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.3287037037037037, "acc_stderr": 0.03203614084670058, "acc_norm": 0.3287037037037037, 
"acc_norm_stderr": 0.03203614084670058 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7352941176470589, "acc_stderr": 0.030964517926923403, "acc_norm": 0.7352941176470589, "acc_norm_stderr": 0.030964517926923403 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.70042194092827, "acc_stderr": 0.029818024749753095, "acc_norm": 0.70042194092827, "acc_norm_stderr": 0.029818024749753095 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6905829596412556, "acc_stderr": 0.03102441174057221, "acc_norm": 0.6905829596412556, "acc_norm_stderr": 0.03102441174057221 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.6793893129770993, "acc_stderr": 0.04093329229834278, "acc_norm": 0.6793893129770993, "acc_norm_stderr": 0.04093329229834278 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7355371900826446, "acc_stderr": 0.040261875275912073, "acc_norm": 0.7355371900826446, "acc_norm_stderr": 0.040261875275912073 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7037037037037037, "acc_stderr": 0.04414343666854934, "acc_norm": 0.7037037037037037, "acc_norm_stderr": 0.04414343666854934 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.6748466257668712, "acc_stderr": 0.036803503712864616, "acc_norm": 0.6748466257668712, "acc_norm_stderr": 0.036803503712864616 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.3482142857142857, "acc_stderr": 0.04521829902833586, "acc_norm": 0.3482142857142857, "acc_norm_stderr": 0.04521829902833586 }, "harness|hendrycksTest-management|5": { "acc": 0.6990291262135923, "acc_stderr": 0.04541609446503949, "acc_norm": 0.6990291262135923, "acc_norm_stderr": 0.04541609446503949 }, "harness|hendrycksTest-marketing|5": { "acc": 0.811965811965812, "acc_stderr": 0.02559819368665225, "acc_norm": 0.811965811965812, "acc_norm_stderr": 0.02559819368665225 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.57, "acc_stderr": 0.049756985195624284, "acc_norm": 0.57, "acc_norm_stderr": 
0.049756985195624284 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7203065134099617, "acc_stderr": 0.01605079214803652, "acc_norm": 0.7203065134099617, "acc_norm_stderr": 0.01605079214803652 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6242774566473989, "acc_stderr": 0.02607431485165708, "acc_norm": 0.6242774566473989, "acc_norm_stderr": 0.02607431485165708 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.3843575418994413, "acc_stderr": 0.016269088663959406, "acc_norm": 0.3843575418994413, "acc_norm_stderr": 0.016269088663959406 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6143790849673203, "acc_stderr": 0.02787074527829027, "acc_norm": 0.6143790849673203, "acc_norm_stderr": 0.02787074527829027 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6141479099678456, "acc_stderr": 0.027648149599751468, "acc_norm": 0.6141479099678456, "acc_norm_stderr": 0.027648149599751468 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.5895061728395061, "acc_stderr": 0.027371350925124764, "acc_norm": 0.5895061728395061, "acc_norm_stderr": 0.027371350925124764 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.42907801418439717, "acc_stderr": 0.02952591430255855, "acc_norm": 0.42907801418439717, "acc_norm_stderr": 0.02952591430255855 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4074315514993481, "acc_stderr": 0.012549473714212226, "acc_norm": 0.4074315514993481, "acc_norm_stderr": 0.012549473714212226 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.5147058823529411, "acc_stderr": 0.03035969707904612, "acc_norm": 0.5147058823529411, "acc_norm_stderr": 0.03035969707904612 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.5441176470588235, "acc_stderr": 0.020148939420415745, "acc_norm": 0.5441176470588235, "acc_norm_stderr": 0.020148939420415745 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6272727272727273, "acc_stderr": 0.04631381319425465, "acc_norm": 0.6272727272727273, 
"acc_norm_stderr": 0.04631381319425465 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.6530612244897959, "acc_stderr": 0.0304725260267265, "acc_norm": 0.6530612244897959, "acc_norm_stderr": 0.0304725260267265 }, "harness|hendrycksTest-sociology|5": { "acc": 0.7064676616915423, "acc_stderr": 0.03220024104534205, "acc_norm": 0.7064676616915423, "acc_norm_stderr": 0.03220024104534205 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.79, "acc_stderr": 0.040936018074033256, "acc_norm": 0.79, "acc_norm_stderr": 0.040936018074033256 }, "harness|hendrycksTest-virology|5": { "acc": 0.463855421686747, "acc_stderr": 0.03882310850890593, "acc_norm": 0.463855421686747, "acc_norm_stderr": 0.03882310850890593 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7309941520467836, "acc_stderr": 0.03401052620104089, "acc_norm": 0.7309941520467836, "acc_norm_stderr": 0.03401052620104089 }, "harness|truthfulqa:mc|0": { "mc1": 0.3390452876376989, "mc1_stderr": 0.016571797910626605, "mc2": 0.4866402159418837, "mc2_stderr": 0.015878252541467283 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? 
[More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
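The config names listed in this card (e.g. `harness_truthfulqa_mc_0`) appear to be derived mechanically from the harness task identifiers that label each result (e.g. `harness|truthfulqa:mc|0`). A minimal sketch of that apparent mapping — the helper name is ours, not part of the leaderboard tooling:

```python
def config_name(task: str) -> str:
    """Map a harness task identifier to its dataset config name,
    following the pattern visible in this card: the separators
    '|', ':' and '-' all become '_'."""
    return task.replace("|", "_").replace(":", "_").replace("-", "_")

print(config_name("harness|truthfulqa:mc|0"))  # harness_truthfulqa_mc_0
```

Applied to a knowledge task it reproduces the listed configs, e.g. `harness|hendrycksTest-world_religions|5` maps to `harness_hendrycksTest_world_religions_5`.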
HYF12053/SD0000
2023-09-25T19:03:52.000Z
[ "region:us" ]
HYF12053
null
null
null
0
0
Entry not found
zxvix/pubmed_100
2023-09-05T05:14:30.000Z
[ "region:us" ]
zxvix
null
null
null
0
0
--- configs: - config_name: default data_files: - split: test path: data/test-* dataset_info: features: - name: MedlineCitation struct: - name: PMID dtype: int32 - name: DateCompleted struct: - name: Year dtype: int32 - name: Month dtype: int32 - name: Day dtype: int32 - name: NumberOfReferences dtype: int32 - name: DateRevised struct: - name: Year dtype: int32 - name: Month dtype: int32 - name: Day dtype: int32 - name: Article struct: - name: Abstract struct: - name: AbstractText dtype: string - name: ArticleTitle dtype: string - name: AuthorList struct: - name: Author sequence: - name: LastName dtype: string - name: ForeName dtype: string - name: Initials dtype: string - name: CollectiveName dtype: string - name: Language dtype: string - name: GrantList struct: - name: Grant sequence: - name: GrantID dtype: string - name: Agency dtype: string - name: Country dtype: string - name: PublicationTypeList struct: - name: PublicationType sequence: string - name: MedlineJournalInfo struct: - name: Country dtype: string - name: ChemicalList struct: - name: Chemical sequence: - name: RegistryNumber dtype: string - name: NameOfSubstance dtype: string - name: CitationSubset dtype: string - name: MeshHeadingList struct: - name: MeshHeading sequence: - name: DescriptorName dtype: string - name: QualifierName dtype: string - name: PubmedData struct: - name: ArticleIdList sequence: - name: ArticleId sequence: string - name: PublicationStatus dtype: string - name: History struct: - name: PubMedPubDate sequence: - name: Year dtype: int32 - name: Month dtype: int32 - name: Day dtype: int32 - name: ReferenceList sequence: - name: Citation dtype: string - name: CitationId dtype: int32 - name: text dtype: string - name: title dtype: string splits: - name: test num_bytes: 303320.4166457245 num_examples: 100 download_size: 214047 dataset_size: 303320.4166457245 --- # Dataset Card for "pubmed_100" [More Information 
needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Orvildsilva/Task
2023-09-05T10:41:58.000Z
[ "region:us" ]
Orvildsilva
null
null
null
0
0
Entry not found
yycho0108/trajdiffuser-diverse-object-dataset
2023-09-06T04:40:52.000Z
[ "license:mit", "region:us" ]
yycho0108
null
null
null
0
0
--- license: mit ---
sidharthsingh1892/cobol_to_java_new_dataset
2023-09-05T05:24:41.000Z
[ "region:us" ]
sidharthsingh1892
null
null
null
0
0
Entry not found
quietwhisper/rvc-bd3-karlach
2023-09-05T05:24:50.000Z
[ "region:us" ]
quietwhisper
null
null
null
0
0
Entry not found
zxvix/pubmed_nonbiomedical_100
2023-09-05T05:26:07.000Z
[ "region:us" ]
zxvix
null
null
null
0
0
--- configs: - config_name: default data_files: - split: test path: data/test-* dataset_info: features: - name: MedlineCitation struct: - name: PMID dtype: int32 - name: DateCompleted struct: - name: Year dtype: int32 - name: Month dtype: int32 - name: Day dtype: int32 - name: NumberOfReferences dtype: int32 - name: DateRevised struct: - name: Year dtype: int32 - name: Month dtype: int32 - name: Day dtype: int32 - name: Article struct: - name: Abstract struct: - name: AbstractText dtype: string - name: ArticleTitle dtype: string - name: AuthorList struct: - name: Author sequence: - name: LastName dtype: string - name: ForeName dtype: string - name: Initials dtype: string - name: CollectiveName dtype: string - name: Language dtype: string - name: GrantList struct: - name: Grant sequence: - name: GrantID dtype: string - name: Agency dtype: string - name: Country dtype: string - name: PublicationTypeList struct: - name: PublicationType sequence: string - name: MedlineJournalInfo struct: - name: Country dtype: string - name: ChemicalList struct: - name: Chemical sequence: - name: RegistryNumber dtype: string - name: NameOfSubstance dtype: string - name: CitationSubset dtype: string - name: MeshHeadingList struct: - name: MeshHeading sequence: - name: DescriptorName dtype: string - name: QualifierName dtype: string - name: PubmedData struct: - name: ArticleIdList sequence: - name: ArticleId sequence: string - name: PublicationStatus dtype: string - name: History struct: - name: PubMedPubDate sequence: - name: Year dtype: int32 - name: Month dtype: int32 - name: Day dtype: int32 - name: ReferenceList sequence: - name: Citation dtype: string - name: CitationId dtype: int32 - name: text dtype: string - name: title dtype: string - name: original_text dtype: string splits: - name: test num_bytes: 412796.0 num_examples: 100 download_size: 281974 dataset_size: 412796.0 --- # Dataset Card for "pubmed_nonbiomedical_100" [More Information 
needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
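Once loaded with `datasets`, the nested `struct` fields in the schema above come back as plain Python dicts. A minimal sketch of that access pattern using a hypothetical record (the values are invented for illustration; no Hub access needed):

```python
# A hypothetical record shaped like the schema in the card above
record = {
    "MedlineCitation": {
        "PMID": 12345,
        "Article": {
            "Abstract": {"AbstractText": "Example abstract."},
            "ArticleTitle": "Example title",
        },
    },
    "title": "Example title",
    "text": "Example abstract.",
}

# Struct fields nest as dicts, so deep fields are reached by chained keys
title = record["MedlineCitation"]["Article"]["ArticleTitle"]
print(title)  # Example title
```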
open-llm-leaderboard/details_Undi95__ReMM-L2-13B-PIPPA
2023-09-05T05:31:11.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
null
0
0
--- pretty_name: Evaluation run of Undi95/ReMM-L2-13B-PIPPA dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [Undi95/ReMM-L2-13B-PIPPA](https://huggingface.co/Undi95/ReMM-L2-13B-PIPPA) on\ \ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 61 configuration, each one coresponding to one of the\ \ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" store all the aggregated results of the\ \ run (and is used to compute and display the agregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Undi95__ReMM-L2-13B-PIPPA\"\ ,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\ \nThese are the [latest results from run 2023-09-05T05:29:49.738166](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__ReMM-L2-13B-PIPPA/blob/main/results_2023-09-05T05%3A29%3A49.738166.json)(note\ \ that their might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5431763390415001,\n\ \ \"acc_stderr\": 0.034452564290971044,\n \"acc_norm\": 0.5468802394769329,\n\ \ \"acc_norm_stderr\": 0.03443249266964571,\n \"mc1\": 0.35128518971848227,\n\ \ \"mc1_stderr\": 0.016711358163544403,\n \"mc2\": 0.49935182390993416,\n\ \ \"mc2_stderr\": 0.01574809606103773\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.5725255972696246,\n \"acc_stderr\": 0.014456862944650654,\n\ \ \"acc_norm\": 0.5972696245733788,\n \"acc_norm_stderr\": 0.014332236306790149\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6374228241386178,\n\ \ \"acc_stderr\": 0.0047976167543723105,\n \"acc_norm\": 0.8312089225253934,\n\ \ \"acc_norm_stderr\": 0.0037380177340378636\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \ \ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5111111111111111,\n\ \ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.5111111111111111,\n\ \ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.4934210526315789,\n \"acc_stderr\": 0.040685900502249704,\n\ \ \"acc_norm\": 0.4934210526315789,\n \"acc_norm_stderr\": 0.040685900502249704\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n\ \ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \ \ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.6075471698113207,\n \"acc_stderr\": 0.03005258057955785,\n\ \ \"acc_norm\": 0.6075471698113207,\n \"acc_norm_stderr\": 0.03005258057955785\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5972222222222222,\n\ \ \"acc_stderr\": 0.04101405519842426,\n \"acc_norm\": 0.5972222222222222,\n\ \ \"acc_norm_stderr\": 
0.04101405519842426\n },\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411019,\n \ \ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411019\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n\ \ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \ \ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5028901734104047,\n\ \ \"acc_stderr\": 0.038124005659748335,\n \"acc_norm\": 0.5028901734104047,\n\ \ \"acc_norm_stderr\": 0.038124005659748335\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.043898699568087764,\n\ \ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.043898699568087764\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\ \ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.42127659574468085,\n \"acc_stderr\": 0.03227834510146268,\n\ \ \"acc_norm\": 0.42127659574468085,\n \"acc_norm_stderr\": 0.03227834510146268\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n\ \ \"acc_stderr\": 0.042270544512322,\n \"acc_norm\": 0.2807017543859649,\n\ \ \"acc_norm_stderr\": 0.042270544512322\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.5103448275862069,\n \"acc_stderr\": 0.04165774775728763,\n\ \ \"acc_norm\": 0.5103448275862069,\n \"acc_norm_stderr\": 0.04165774775728763\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.328042328042328,\n \"acc_stderr\": 0.024180497164376896,\n \"\ acc_norm\": 
0.328042328042328,\n \"acc_norm_stderr\": 0.024180497164376896\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.31746031746031744,\n\ \ \"acc_stderr\": 0.04163453031302859,\n \"acc_norm\": 0.31746031746031744,\n\ \ \"acc_norm_stderr\": 0.04163453031302859\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \ \ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\ \ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6258064516129033,\n\ \ \"acc_stderr\": 0.027528904299845704,\n \"acc_norm\": 0.6258064516129033,\n\ \ \"acc_norm_stderr\": 0.027528904299845704\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.4088669950738916,\n \"acc_stderr\": 0.034590588158832314,\n\ \ \"acc_norm\": 0.4088669950738916,\n \"acc_norm_stderr\": 0.034590588158832314\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\"\ : 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.0368105086916155,\n\ \ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.0368105086916155\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.6818181818181818,\n \"acc_stderr\": 0.0331847733384533,\n \"acc_norm\"\ : 0.6818181818181818,\n \"acc_norm_stderr\": 0.0331847733384533\n },\n\ \ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \ \ \"acc\": 0.7875647668393783,\n \"acc_stderr\": 0.029519282616817247,\n\ \ \"acc_norm\": 0.7875647668393783,\n \"acc_norm_stderr\": 0.029519282616817247\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.517948717948718,\n \"acc_stderr\": 0.025334667080954925,\n \ \ \"acc_norm\": 0.517948717948718,\n \"acc_norm_stderr\": 0.025334667080954925\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.2740740740740741,\n \"acc_stderr\": 0.027195934804085622,\n \ \ \"acc_norm\": 0.2740740740740741,\n \"acc_norm_stderr\": 0.027195934804085622\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.5252100840336135,\n \"acc_stderr\": 0.03243718055137411,\n \ \ \"acc_norm\": 0.5252100840336135,\n \"acc_norm_stderr\": 0.03243718055137411\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.2781456953642384,\n \"acc_stderr\": 0.03658603262763743,\n \"\ acc_norm\": 0.2781456953642384,\n \"acc_norm_stderr\": 0.03658603262763743\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.710091743119266,\n \"acc_stderr\": 0.019453066609201597,\n \"\ acc_norm\": 0.710091743119266,\n \"acc_norm_stderr\": 0.019453066609201597\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.3888888888888889,\n \"acc_stderr\": 0.033247089118091176,\n \"\ acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.033247089118091176\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.7696078431372549,\n \"acc_stderr\": 0.029554292605695066,\n \"\ acc_norm\": 0.7696078431372549,\n \"acc_norm_stderr\": 0.029554292605695066\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.7130801687763713,\n \"acc_stderr\": 0.02944377302259469,\n \ \ \"acc_norm\": 0.7130801687763713,\n \"acc_norm_stderr\": 0.02944377302259469\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6636771300448431,\n\ \ \"acc_stderr\": 0.031708824268455005,\n \"acc_norm\": 0.6636771300448431,\n\ \ \"acc_norm_stderr\": 0.031708824268455005\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.6564885496183206,\n \"acc_stderr\": 0.041649760719448786,\n\ \ \"acc_norm\": 0.6564885496183206,\n \"acc_norm_stderr\": 0.041649760719448786\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.7272727272727273,\n \"acc_stderr\": 0.04065578140908705,\n \"\ acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04065578140908705\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n\ \ \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n\ \ \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.6625766871165644,\n \"acc_stderr\": 0.03714908409935575,\n\ \ \"acc_norm\": 0.6625766871165644,\n \"acc_norm_stderr\": 0.03714908409935575\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.33035714285714285,\n\ \ \"acc_stderr\": 0.04464285714285714,\n \"acc_norm\": 0.33035714285714285,\n\ \ \"acc_norm_stderr\": 0.04464285714285714\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.6504854368932039,\n \"acc_stderr\": 0.04721188506097172,\n\ \ \"acc_norm\": 0.6504854368932039,\n \"acc_norm_stderr\": 0.04721188506097172\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7692307692307693,\n\ \ \"acc_stderr\": 0.02760192138141759,\n \"acc_norm\": 0.7692307692307693,\n\ \ \"acc_norm_stderr\": 0.02760192138141759\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \ \ \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.735632183908046,\n\ \ \"acc_stderr\": 0.01576998484069052,\n \"acc_norm\": 0.735632183908046,\n\ \ \"acc_norm_stderr\": 0.01576998484069052\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.6329479768786127,\n \"acc_stderr\": 0.025950054337654068,\n\ \ \"acc_norm\": 0.6329479768786127,\n \"acc_norm_stderr\": 0.025950054337654068\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3039106145251397,\n\ \ \"acc_stderr\": 0.015382845587584525,\n \"acc_norm\": 
0.3039106145251397,\n\ \ \"acc_norm_stderr\": 0.015382845587584525\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.6143790849673203,\n \"acc_stderr\": 0.02787074527829028,\n\ \ \"acc_norm\": 0.6143790849673203,\n \"acc_norm_stderr\": 0.02787074527829028\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6366559485530546,\n\ \ \"acc_stderr\": 0.027316847674192707,\n \"acc_norm\": 0.6366559485530546,\n\ \ \"acc_norm_stderr\": 0.027316847674192707\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.6141975308641975,\n \"acc_stderr\": 0.027085401226132146,\n\ \ \"acc_norm\": 0.6141975308641975,\n \"acc_norm_stderr\": 0.027085401226132146\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.40425531914893614,\n \"acc_stderr\": 0.02927553215970473,\n \ \ \"acc_norm\": 0.40425531914893614,\n \"acc_norm_stderr\": 0.02927553215970473\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4211212516297262,\n\ \ \"acc_stderr\": 0.012610325733489906,\n \"acc_norm\": 0.4211212516297262,\n\ \ \"acc_norm_stderr\": 0.012610325733489906\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.5073529411764706,\n \"acc_stderr\": 0.030369552523902173,\n\ \ \"acc_norm\": 0.5073529411764706,\n \"acc_norm_stderr\": 0.030369552523902173\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.553921568627451,\n \"acc_stderr\": 0.02010986454718136,\n \ \ \"acc_norm\": 0.553921568627451,\n \"acc_norm_stderr\": 0.02010986454718136\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n\ \ \"acc_stderr\": 0.046075820907199756,\n \"acc_norm\": 0.6363636363636364,\n\ \ \"acc_norm_stderr\": 0.046075820907199756\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.6204081632653061,\n \"acc_stderr\": 0.031067211262872468,\n\ \ \"acc_norm\": 0.6204081632653061,\n \"acc_norm_stderr\": 0.031067211262872468\n\ \ },\n 
\"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7014925373134329,\n\ \ \"acc_stderr\": 0.03235743789355042,\n \"acc_norm\": 0.7014925373134329,\n\ \ \"acc_norm_stderr\": 0.03235743789355042\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \ \ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4397590361445783,\n\ \ \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.4397590361445783,\n\ \ \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.7309941520467836,\n \"acc_stderr\": 0.03401052620104089,\n\ \ \"acc_norm\": 0.7309941520467836,\n \"acc_norm_stderr\": 0.03401052620104089\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.35128518971848227,\n\ \ \"mc1_stderr\": 0.016711358163544403,\n \"mc2\": 0.49935182390993416,\n\ \ \"mc2_stderr\": 0.01574809606103773\n }\n}\n```" repo_url: https://huggingface.co/Undi95/ReMM-L2-13B-PIPPA leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_09_05T05_29_49.738166 path: - '**/details_harness|arc:challenge|25_2023-09-05T05:29:49.738166.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-09-05T05:29:49.738166.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_09_05T05_29_49.738166 path: - '**/details_harness|hellaswag|10_2023-09-05T05:29:49.738166.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-09-05T05:29:49.738166.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_09_05T05_29_49.738166 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-09-05T05:29:49.738166.parquet' - 
'**/details_harness|hendrycksTest-astronomy|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T05:29:49.738166.parquet' - 
'**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-management|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T05:29:49.738166.parquet' - 
'**/details_harness|hendrycksTest-nutrition|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-09-05T05:29:49.738166.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T05:29:49.738166.parquet' - 
'**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T05:29:49.738166.parquet' - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-management|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T05:29:49.738166.parquet' - 
'**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-09-05T05:29:49.738166.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-09-05T05:29:49.738166.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_09_05T05_29_49.738166 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T05:29:49.738166.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T05:29:49.738166.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_09_05T05_29_49.738166 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-09-05T05:29:49.738166.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-09-05T05:29:49.738166.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_09_05T05_29_49.738166 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-09-05T05:29:49.738166.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-09-05T05:29:49.738166.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_09_05T05_29_49.738166 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T05:29:49.738166.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T05:29:49.738166.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_09_05T05_29_49.738166 path: - 
'**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T05:29:49.738166.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T05:29:49.738166.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_09_05T05_29_49.738166 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-09-05T05:29:49.738166.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-09-05T05:29:49.738166.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_09_05T05_29_49.738166 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T05:29:49.738166.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T05:29:49.738166.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_09_05T05_29_49.738166 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T05:29:49.738166.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T05:29:49.738166.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_09_05T05_29_49.738166 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T05:29:49.738166.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T05:29:49.738166.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_09_05T05_29_49.738166 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T05:29:49.738166.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T05:29:49.738166.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_09_05T05_29_49.738166 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-09-05T05:29:49.738166.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-09-05T05:29:49.738166.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_09_05T05_29_49.738166 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-09-05T05:29:49.738166.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-09-05T05:29:49.738166.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_09_05T05_29_49.738166 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T05:29:49.738166.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T05:29:49.738166.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_09_05T05_29_49.738166 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-09-05T05:29:49.738166.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-09-05T05:29:49.738166.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_09_05T05_29_49.738166 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T05:29:49.738166.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T05:29:49.738166.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_09_05T05_29_49.738166 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T05:29:49.738166.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T05:29:49.738166.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_09_05T05_29_49.738166 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T05:29:49.738166.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T05:29:49.738166.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_09_05T05_29_49.738166 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-09-05T05:29:49.738166.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-09-05T05:29:49.738166.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_09_05T05_29_49.738166 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T05:29:49.738166.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T05:29:49.738166.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_09_05T05_29_49.738166 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T05:29:49.738166.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T05:29:49.738166.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_09_05T05_29_49.738166 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T05:29:49.738166.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T05:29:49.738166.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_09_05T05_29_49.738166 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T05:29:49.738166.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T05:29:49.738166.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_09_05T05_29_49.738166 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T05:29:49.738166.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T05:29:49.738166.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_09_05T05_29_49.738166 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T05:29:49.738166.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T05:29:49.738166.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_09_05T05_29_49.738166 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T05:29:49.738166.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T05:29:49.738166.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_09_05T05_29_49.738166 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T05:29:49.738166.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T05:29:49.738166.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_09_05T05_29_49.738166 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T05:29:49.738166.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T05:29:49.738166.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_09_05T05_29_49.738166 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T05:29:49.738166.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T05:29:49.738166.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_09_05T05_29_49.738166 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T05:29:49.738166.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T05:29:49.738166.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_09_05T05_29_49.738166 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T05:29:49.738166.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T05:29:49.738166.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_09_05T05_29_49.738166 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T05:29:49.738166.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T05:29:49.738166.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_09_05T05_29_49.738166 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T05:29:49.738166.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T05:29:49.738166.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_09_05T05_29_49.738166 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-09-05T05:29:49.738166.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-09-05T05:29:49.738166.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_09_05T05_29_49.738166 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T05:29:49.738166.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T05:29:49.738166.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_09_05T05_29_49.738166 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-09-05T05:29:49.738166.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-09-05T05:29:49.738166.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_09_05T05_29_49.738166 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T05:29:49.738166.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T05:29:49.738166.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_09_05T05_29_49.738166 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T05:29:49.738166.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T05:29:49.738166.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_09_05T05_29_49.738166 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T05:29:49.738166.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T05:29:49.738166.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_09_05T05_29_49.738166 path: - '**/details_harness|hendrycksTest-management|5_2023-09-05T05:29:49.738166.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-09-05T05:29:49.738166.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_09_05T05_29_49.738166 path: - '**/details_harness|hendrycksTest-marketing|5_2023-09-05T05:29:49.738166.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-09-05T05:29:49.738166.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_09_05T05_29_49.738166 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T05:29:49.738166.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T05:29:49.738166.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_09_05T05_29_49.738166 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T05:29:49.738166.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T05:29:49.738166.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_09_05T05_29_49.738166 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T05:29:49.738166.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T05:29:49.738166.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_09_05T05_29_49.738166 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T05:29:49.738166.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T05:29:49.738166.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_09_05T05_29_49.738166 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-09-05T05:29:49.738166.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-09-05T05:29:49.738166.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_09_05T05_29_49.738166 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-09-05T05:29:49.738166.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-09-05T05:29:49.738166.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_09_05T05_29_49.738166 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-09-05T05:29:49.738166.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-09-05T05:29:49.738166.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_09_05T05_29_49.738166 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T05:29:49.738166.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T05:29:49.738166.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_09_05T05_29_49.738166 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-09-05T05:29:49.738166.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-09-05T05:29:49.738166.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_09_05T05_29_49.738166 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T05:29:49.738166.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T05:29:49.738166.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_09_05T05_29_49.738166 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T05:29:49.738166.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T05:29:49.738166.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_09_05T05_29_49.738166 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-09-05T05:29:49.738166.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-09-05T05:29:49.738166.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_09_05T05_29_49.738166 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-09-05T05:29:49.738166.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-09-05T05:29:49.738166.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_09_05T05_29_49.738166 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-09-05T05:29:49.738166.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-09-05T05:29:49.738166.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_09_05T05_29_49.738166 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T05:29:49.738166.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T05:29:49.738166.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_09_05T05_29_49.738166 path: - '**/details_harness|hendrycksTest-virology|5_2023-09-05T05:29:49.738166.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-09-05T05:29:49.738166.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_09_05T05_29_49.738166 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-09-05T05:29:49.738166.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-09-05T05:29:49.738166.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_09_05T05_29_49.738166 path: - '**/details_harness|truthfulqa:mc|0_2023-09-05T05:29:49.738166.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-09-05T05:29:49.738166.parquet' - config_name: results data_files: - split: 2023_09_05T05_29_49.738166 path: - results_2023-09-05T05:29:49.738166.parquet - split: latest path: - results_2023-09-05T05:29:49.738166.parquet --- # Dataset Card for Evaluation run of Undi95/ReMM-L2-13B-PIPPA ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/Undi95/ReMM-L2-13B-PIPPA - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model 
[Undi95/ReMM-L2-13B-PIPPA](https://huggingface.co/Undi95/ReMM-L2-13B-PIPPA) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_Undi95__ReMM-L2-13B-PIPPA", "harness_truthfulqa_mc_0", split="latest") ``` ## Latest results These are the [latest results from run 2023-09-05T05:29:49.738166](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__ReMM-L2-13B-PIPPA/blob/main/results_2023-09-05T05%3A29%3A49.738166.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.5431763390415001, "acc_stderr": 0.034452564290971044, "acc_norm": 0.5468802394769329, "acc_norm_stderr": 0.03443249266964571, "mc1": 0.35128518971848227, "mc1_stderr": 0.016711358163544403, "mc2": 0.49935182390993416, "mc2_stderr": 0.01574809606103773 }, "harness|arc:challenge|25": { "acc": 0.5725255972696246, "acc_stderr": 0.014456862944650654, "acc_norm": 0.5972696245733788, "acc_norm_stderr": 0.014332236306790149 }, "harness|hellaswag|10": { "acc": 0.6374228241386178, "acc_stderr": 0.0047976167543723105, "acc_norm": 0.8312089225253934, "acc_norm_stderr": 0.0037380177340378636 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.32, "acc_stderr": 0.046882617226215034, "acc_norm": 0.32, "acc_norm_stderr": 0.046882617226215034 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.5111111111111111, "acc_stderr": 0.04318275491977976, "acc_norm": 0.5111111111111111, "acc_norm_stderr": 0.04318275491977976 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.4934210526315789, "acc_stderr": 0.040685900502249704, "acc_norm": 0.4934210526315789, "acc_norm_stderr": 0.040685900502249704 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.52, "acc_stderr": 0.050211673156867795, "acc_norm": 0.52, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6075471698113207, "acc_stderr": 0.03005258057955785, "acc_norm": 0.6075471698113207, "acc_norm_stderr": 0.03005258057955785 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.5972222222222222, "acc_stderr": 0.04101405519842426, "acc_norm": 0.5972222222222222, "acc_norm_stderr": 0.04101405519842426 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.35, "acc_stderr": 0.04793724854411019, "acc_norm": 0.35, "acc_norm_stderr": 0.04793724854411019 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.44, "acc_stderr": 0.04988876515698589, "acc_norm": 0.44, 
"acc_norm_stderr": 0.04988876515698589 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.29, "acc_stderr": 0.045604802157206845, "acc_norm": 0.29, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.5028901734104047, "acc_stderr": 0.038124005659748335, "acc_norm": 0.5028901734104047, "acc_norm_stderr": 0.038124005659748335 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.2647058823529412, "acc_stderr": 0.043898699568087764, "acc_norm": 0.2647058823529412, "acc_norm_stderr": 0.043898699568087764 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.69, "acc_stderr": 0.04648231987117316, "acc_norm": 0.69, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.42127659574468085, "acc_stderr": 0.03227834510146268, "acc_norm": 0.42127659574468085, "acc_norm_stderr": 0.03227834510146268 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.2807017543859649, "acc_stderr": 0.042270544512322, "acc_norm": 0.2807017543859649, "acc_norm_stderr": 0.042270544512322 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5103448275862069, "acc_stderr": 0.04165774775728763, "acc_norm": 0.5103448275862069, "acc_norm_stderr": 0.04165774775728763 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.328042328042328, "acc_stderr": 0.024180497164376896, "acc_norm": 0.328042328042328, "acc_norm_stderr": 0.024180497164376896 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.31746031746031744, "acc_stderr": 0.04163453031302859, "acc_norm": 0.31746031746031744, "acc_norm_stderr": 0.04163453031302859 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.37, "acc_stderr": 0.048523658709391, "acc_norm": 0.37, "acc_norm_stderr": 0.048523658709391 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.6258064516129033, "acc_stderr": 0.027528904299845704, "acc_norm": 0.6258064516129033, "acc_norm_stderr": 0.027528904299845704 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4088669950738916, "acc_stderr": 0.034590588158832314, "acc_norm": 0.4088669950738916, "acc_norm_stderr": 0.034590588158832314 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.51, "acc_stderr": 0.05024183937956912, "acc_norm": 0.51, "acc_norm_stderr": 0.05024183937956912 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.6666666666666666, "acc_stderr": 0.0368105086916155, "acc_norm": 0.6666666666666666, "acc_norm_stderr": 0.0368105086916155 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.6818181818181818, "acc_stderr": 0.0331847733384533, "acc_norm": 0.6818181818181818, "acc_norm_stderr": 0.0331847733384533 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.7875647668393783, "acc_stderr": 0.029519282616817247, "acc_norm": 0.7875647668393783, "acc_norm_stderr": 0.029519282616817247 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.517948717948718, "acc_stderr": 0.025334667080954925, "acc_norm": 0.517948717948718, "acc_norm_stderr": 0.025334667080954925 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.2740740740740741, "acc_stderr": 0.027195934804085622, "acc_norm": 0.2740740740740741, "acc_norm_stderr": 0.027195934804085622 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.5252100840336135, "acc_stderr": 0.03243718055137411, "acc_norm": 0.5252100840336135, "acc_norm_stderr": 0.03243718055137411 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.2781456953642384, "acc_stderr": 0.03658603262763743, "acc_norm": 0.2781456953642384, "acc_norm_stderr": 0.03658603262763743 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.710091743119266, "acc_stderr": 0.019453066609201597, "acc_norm": 0.710091743119266, "acc_norm_stderr": 0.019453066609201597 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.3888888888888889, "acc_stderr": 0.033247089118091176, 
"acc_norm": 0.3888888888888889, "acc_norm_stderr": 0.033247089118091176 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7696078431372549, "acc_stderr": 0.029554292605695066, "acc_norm": 0.7696078431372549, "acc_norm_stderr": 0.029554292605695066 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7130801687763713, "acc_stderr": 0.02944377302259469, "acc_norm": 0.7130801687763713, "acc_norm_stderr": 0.02944377302259469 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6636771300448431, "acc_stderr": 0.031708824268455005, "acc_norm": 0.6636771300448431, "acc_norm_stderr": 0.031708824268455005 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.6564885496183206, "acc_stderr": 0.041649760719448786, "acc_norm": 0.6564885496183206, "acc_norm_stderr": 0.041649760719448786 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7272727272727273, "acc_stderr": 0.04065578140908705, "acc_norm": 0.7272727272727273, "acc_norm_stderr": 0.04065578140908705 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7314814814814815, "acc_stderr": 0.042844679680521934, "acc_norm": 0.7314814814814815, "acc_norm_stderr": 0.042844679680521934 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.6625766871165644, "acc_stderr": 0.03714908409935575, "acc_norm": 0.6625766871165644, "acc_norm_stderr": 0.03714908409935575 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.33035714285714285, "acc_stderr": 0.04464285714285714, "acc_norm": 0.33035714285714285, "acc_norm_stderr": 0.04464285714285714 }, "harness|hendrycksTest-management|5": { "acc": 0.6504854368932039, "acc_stderr": 0.04721188506097172, "acc_norm": 0.6504854368932039, "acc_norm_stderr": 0.04721188506097172 }, "harness|hendrycksTest-marketing|5": { "acc": 0.7692307692307693, "acc_stderr": 0.02760192138141759, "acc_norm": 0.7692307692307693, "acc_norm_stderr": 0.02760192138141759 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.58, "acc_stderr": 0.049604496374885836, 
"acc_norm": 0.58, "acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.735632183908046, "acc_stderr": 0.01576998484069052, "acc_norm": 0.735632183908046, "acc_norm_stderr": 0.01576998484069052 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6329479768786127, "acc_stderr": 0.025950054337654068, "acc_norm": 0.6329479768786127, "acc_norm_stderr": 0.025950054337654068 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.3039106145251397, "acc_stderr": 0.015382845587584525, "acc_norm": 0.3039106145251397, "acc_norm_stderr": 0.015382845587584525 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6143790849673203, "acc_stderr": 0.02787074527829028, "acc_norm": 0.6143790849673203, "acc_norm_stderr": 0.02787074527829028 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6366559485530546, "acc_stderr": 0.027316847674192707, "acc_norm": 0.6366559485530546, "acc_norm_stderr": 0.027316847674192707 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.6141975308641975, "acc_stderr": 0.027085401226132146, "acc_norm": 0.6141975308641975, "acc_norm_stderr": 0.027085401226132146 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.40425531914893614, "acc_stderr": 0.02927553215970473, "acc_norm": 0.40425531914893614, "acc_norm_stderr": 0.02927553215970473 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4211212516297262, "acc_stderr": 0.012610325733489906, "acc_norm": 0.4211212516297262, "acc_norm_stderr": 0.012610325733489906 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.5073529411764706, "acc_stderr": 0.030369552523902173, "acc_norm": 0.5073529411764706, "acc_norm_stderr": 0.030369552523902173 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.553921568627451, "acc_stderr": 0.02010986454718136, "acc_norm": 0.553921568627451, "acc_norm_stderr": 0.02010986454718136 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6363636363636364, "acc_stderr": 0.046075820907199756, 
"acc_norm": 0.6363636363636364, "acc_norm_stderr": 0.046075820907199756 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.6204081632653061, "acc_stderr": 0.031067211262872468, "acc_norm": 0.6204081632653061, "acc_norm_stderr": 0.031067211262872468 }, "harness|hendrycksTest-sociology|5": { "acc": 0.7014925373134329, "acc_stderr": 0.03235743789355042, "acc_norm": 0.7014925373134329, "acc_norm_stderr": 0.03235743789355042 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.84, "acc_stderr": 0.03684529491774708, "acc_norm": 0.84, "acc_norm_stderr": 0.03684529491774708 }, "harness|hendrycksTest-virology|5": { "acc": 0.4397590361445783, "acc_stderr": 0.03864139923699122, "acc_norm": 0.4397590361445783, "acc_norm_stderr": 0.03864139923699122 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7309941520467836, "acc_stderr": 0.03401052620104089, "acc_norm": 0.7309941520467836, "acc_norm_stderr": 0.03401052620104089 }, "harness|truthfulqa:mc|0": { "mc1": 0.35128518971848227, "mc1_stderr": 0.016711358163544403, "mc2": 0.49935182390993416, "mc2_stderr": 0.01574809606103773 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? 
[More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
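The card notes that each run's split is named using the timestamp of the run. A minimal sketch of that mapping, inferred from the split names listed in the configuration above (i.e. `-` and `:` replaced with `_`; this is an assumption from the names shown here, not an official API):

```python
def run_timestamp_to_split_name(timestamp: str) -> str:
    """Map a run timestamp such as "2023-09-05T05:29:49.738166" to the
    split name used in this dataset's configs. Inferred from the split
    names above: "-" and ":" become "_", the fractional-second dot stays.
    """
    return timestamp.replace("-", "_").replace(":", "_")

# e.g. "2023-09-05T05:29:49.738166" -> "2023_09_05T05_29_49.738166"
```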
DynamicSuperb/DialogueActPairing_DailyTalk
2023-09-09T12:10:22.000Z
[ "region:us" ]
DynamicSuperb
null
null
null
0
0
--- dataset_info: features: - name: file dtype: string - name: audio dtype: audio - name: file2 dtype: string - name: audio2 dtype: audio - name: instruction dtype: string - name: label dtype: string splits: - name: test num_bytes: 1146408167.0 num_examples: 2000 download_size: 1062179728 dataset_size: 1146408167.0 --- # Dataset Card for "DialogueActPairing_DailyTalk" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
abdulhade/AsosoftWhisperv2
2023-09-05T09:02:10.000Z
[ "region:us" ]
abdulhade
null
null
null
0
0
--- configs: - config_name: default data_files: - split: train path: data/train-* dataset_info: features: - name: input_features sequence: sequence: float32 - name: labels sequence: int64 splits: - name: train num_bytes: 28038699208 num_examples: 29188 download_size: 4307818668 dataset_size: 28038699208 --- # Dataset Card for "AsosoftWhisperv2" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
indrii/testtt
2023-09-05T05:56:06.000Z
[ "region:us" ]
indrii
null
null
null
0
0
Entry not found
foxxy-hm/slu-augmented-data
2023-09-05T16:59:46.000Z
[ "region:us" ]
foxxy-hm
null
null
null
0
0
--- configs: - config_name: default data_files: - split: train path: data/train-* - split: test path: data/test-* dataset_info: features: - name: speech sequence: float64 - name: sampling_rate dtype: int64 - name: target_text dtype: string splits: - name: train num_bytes: 3720263051.262795 num_examples: 7190 - name: test num_bytes: 930324473.7372051 num_examples: 1798 download_size: 2043481654 dataset_size: 4650587525.0 --- # Dataset Card for "slu-augmented-data" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)