| datasetId | card |
|---|---|
usp-esalq/llama-2-papers | ---
license: mit
---
|
open-llm-leaderboard/details_maldv__dragonwar-7b-alpha | ---
pretty_name: Evaluation run of maldv/dragonwar-7b-alpha
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [maldv/dragonwar-7b-alpha](https://huggingface.co/maldv/dragonwar-7b-alpha) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_maldv__dragonwar-7b-alpha\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-05T14:25:13.678265](https://huggingface.co/datasets/open-llm-leaderboard/details_maldv__dragonwar-7b-alpha/blob/main/results_2024-04-05T14-25-13.678265.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks. You can find each in the results and in the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5637520301142515,\n\
\ \"acc_stderr\": 0.0337124492685975,\n \"acc_norm\": 0.5712635003505222,\n\
\ \"acc_norm_stderr\": 0.03443150810775058,\n \"mc1\": 0.28518971848225216,\n\
\ \"mc1_stderr\": 0.015805827874454892,\n \"mc2\": 0.43727256663199443,\n\
\ \"mc2_stderr\": 0.014854735175793955\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5409556313993175,\n \"acc_stderr\": 0.014562291073601229,\n\
\ \"acc_norm\": 0.5656996587030717,\n \"acc_norm_stderr\": 0.014484703048857357\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6203943437562238,\n\
\ \"acc_stderr\": 0.004842969887794103,\n \"acc_norm\": 0.8153754232224656,\n\
\ \"acc_norm_stderr\": 0.003871997616734269\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421296,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421296\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5407407407407407,\n\
\ \"acc_stderr\": 0.04304979692464241,\n \"acc_norm\": 0.5407407407407407,\n\
\ \"acc_norm_stderr\": 0.04304979692464241\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.618421052631579,\n \"acc_stderr\": 0.03953173377749194,\n\
\ \"acc_norm\": 0.618421052631579,\n \"acc_norm_stderr\": 0.03953173377749194\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.48,\n\
\ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5773584905660377,\n \"acc_stderr\": 0.03040233144576954,\n\
\ \"acc_norm\": 0.5773584905660377,\n \"acc_norm_stderr\": 0.03040233144576954\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.625,\n\
\ \"acc_stderr\": 0.04048439222695598,\n \"acc_norm\": 0.625,\n \
\ \"acc_norm_stderr\": 0.04048439222695598\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5606936416184971,\n\
\ \"acc_stderr\": 0.037842719328874674,\n \"acc_norm\": 0.5606936416184971,\n\
\ \"acc_norm_stderr\": 0.037842719328874674\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082635,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082635\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.65,\n\
\ \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.46382978723404256,\n \"acc_stderr\": 0.032600385118357715,\n\
\ \"acc_norm\": 0.46382978723404256,\n \"acc_norm_stderr\": 0.032600385118357715\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3684210526315789,\n\
\ \"acc_stderr\": 0.04537815354939391,\n \"acc_norm\": 0.3684210526315789,\n\
\ \"acc_norm_stderr\": 0.04537815354939391\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5103448275862069,\n \"acc_stderr\": 0.04165774775728763,\n\
\ \"acc_norm\": 0.5103448275862069,\n \"acc_norm_stderr\": 0.04165774775728763\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3306878306878307,\n \"acc_stderr\": 0.02422996529842508,\n \"\
acc_norm\": 0.3306878306878307,\n \"acc_norm_stderr\": 0.02422996529842508\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.38095238095238093,\n\
\ \"acc_stderr\": 0.04343525428949098,\n \"acc_norm\": 0.38095238095238093,\n\
\ \"acc_norm_stderr\": 0.04343525428949098\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6387096774193548,\n\
\ \"acc_stderr\": 0.027327548447957543,\n \"acc_norm\": 0.6387096774193548,\n\
\ \"acc_norm_stderr\": 0.027327548447957543\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4433497536945813,\n \"acc_stderr\": 0.03495334582162934,\n\
\ \"acc_norm\": 0.4433497536945813,\n \"acc_norm_stderr\": 0.03495334582162934\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\"\
: 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009181,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009181\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6717171717171717,\n \"acc_stderr\": 0.03345678422756776,\n \"\
acc_norm\": 0.6717171717171717,\n \"acc_norm_stderr\": 0.03345678422756776\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7979274611398963,\n \"acc_stderr\": 0.02897908979429673,\n\
\ \"acc_norm\": 0.7979274611398963,\n \"acc_norm_stderr\": 0.02897908979429673\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5794871794871795,\n \"acc_stderr\": 0.025028610276710862,\n\
\ \"acc_norm\": 0.5794871794871795,\n \"acc_norm_stderr\": 0.025028610276710862\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3,\n \"acc_stderr\": 0.02794045713622842,\n \"acc_norm\"\
: 0.3,\n \"acc_norm_stderr\": 0.02794045713622842\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\"\
: {\n \"acc\": 0.5756302521008403,\n \"acc_stderr\": 0.032104790510157764,\n\
\ \"acc_norm\": 0.5756302521008403,\n \"acc_norm_stderr\": 0.032104790510157764\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2980132450331126,\n \"acc_stderr\": 0.03734535676787198,\n \"\
acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.03734535676787198\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7467889908256881,\n \"acc_stderr\": 0.01864407304137504,\n \"\
acc_norm\": 0.7467889908256881,\n \"acc_norm_stderr\": 0.01864407304137504\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4166666666666667,\n \"acc_stderr\": 0.033622774366080424,\n \"\
acc_norm\": 0.4166666666666667,\n \"acc_norm_stderr\": 0.033622774366080424\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7401960784313726,\n \"acc_stderr\": 0.03077855467869326,\n \"\
acc_norm\": 0.7401960784313726,\n \"acc_norm_stderr\": 0.03077855467869326\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7510548523206751,\n \"acc_stderr\": 0.028146970599422644,\n \
\ \"acc_norm\": 0.7510548523206751,\n \"acc_norm_stderr\": 0.028146970599422644\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6188340807174888,\n\
\ \"acc_stderr\": 0.03259625118416827,\n \"acc_norm\": 0.6188340807174888,\n\
\ \"acc_norm_stderr\": 0.03259625118416827\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6870229007633588,\n \"acc_stderr\": 0.04066962905677697,\n\
\ \"acc_norm\": 0.6870229007633588,\n \"acc_norm_stderr\": 0.04066962905677697\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8016528925619835,\n \"acc_stderr\": 0.036401182719909456,\n \"\
acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.036401182719909456\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6388888888888888,\n\
\ \"acc_stderr\": 0.04643454608906275,\n \"acc_norm\": 0.6388888888888888,\n\
\ \"acc_norm_stderr\": 0.04643454608906275\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6687116564417178,\n \"acc_stderr\": 0.03697983910025588,\n\
\ \"acc_norm\": 0.6687116564417178,\n \"acc_norm_stderr\": 0.03697983910025588\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6990291262135923,\n \"acc_stderr\": 0.045416094465039476,\n\
\ \"acc_norm\": 0.6990291262135923,\n \"acc_norm_stderr\": 0.045416094465039476\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8205128205128205,\n\
\ \"acc_stderr\": 0.025140935950335452,\n \"acc_norm\": 0.8205128205128205,\n\
\ \"acc_norm_stderr\": 0.025140935950335452\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.722860791826309,\n\
\ \"acc_stderr\": 0.01600563629412242,\n \"acc_norm\": 0.722860791826309,\n\
\ \"acc_norm_stderr\": 0.01600563629412242\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6502890173410405,\n \"acc_stderr\": 0.025674281456531015,\n\
\ \"acc_norm\": 0.6502890173410405,\n \"acc_norm_stderr\": 0.025674281456531015\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3541899441340782,\n\
\ \"acc_stderr\": 0.015995644947299235,\n \"acc_norm\": 0.3541899441340782,\n\
\ \"acc_norm_stderr\": 0.015995644947299235\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6633986928104575,\n \"acc_stderr\": 0.02705797462449438,\n\
\ \"acc_norm\": 0.6633986928104575,\n \"acc_norm_stderr\": 0.02705797462449438\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.639871382636656,\n\
\ \"acc_stderr\": 0.027264297599804012,\n \"acc_norm\": 0.639871382636656,\n\
\ \"acc_norm_stderr\": 0.027264297599804012\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6111111111111112,\n \"acc_stderr\": 0.02712511551316686,\n\
\ \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.02712511551316686\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4397163120567376,\n \"acc_stderr\": 0.02960991207559411,\n \
\ \"acc_norm\": 0.4397163120567376,\n \"acc_norm_stderr\": 0.02960991207559411\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.42698826597131684,\n\
\ \"acc_stderr\": 0.012633353557534425,\n \"acc_norm\": 0.42698826597131684,\n\
\ \"acc_norm_stderr\": 0.012633353557534425\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5625,\n \"acc_stderr\": 0.030134614954403924,\n \
\ \"acc_norm\": 0.5625,\n \"acc_norm_stderr\": 0.030134614954403924\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5620915032679739,\n \"acc_stderr\": 0.02007125788688653,\n \
\ \"acc_norm\": 0.5620915032679739,\n \"acc_norm_stderr\": 0.02007125788688653\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.04582004841505417,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.04582004841505417\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.636734693877551,\n \"acc_stderr\": 0.030789051139030806,\n\
\ \"acc_norm\": 0.636734693877551,\n \"acc_norm_stderr\": 0.030789051139030806\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7810945273631841,\n\
\ \"acc_stderr\": 0.029239174636647,\n \"acc_norm\": 0.7810945273631841,\n\
\ \"acc_norm_stderr\": 0.029239174636647\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \
\ \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816506\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4578313253012048,\n\
\ \"acc_stderr\": 0.038786267710023595,\n \"acc_norm\": 0.4578313253012048,\n\
\ \"acc_norm_stderr\": 0.038786267710023595\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7660818713450293,\n \"acc_stderr\": 0.03246721765117825,\n\
\ \"acc_norm\": 0.7660818713450293,\n \"acc_norm_stderr\": 0.03246721765117825\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.28518971848225216,\n\
\ \"mc1_stderr\": 0.015805827874454892,\n \"mc2\": 0.43727256663199443,\n\
\ \"mc2_stderr\": 0.014854735175793955\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7008681925808997,\n \"acc_stderr\": 0.012868639066091536\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.20318423047763456,\n \
\ \"acc_stderr\": 0.011083227665267797\n }\n}\n```"
repo_url: https://huggingface.co/maldv/dragonwar-7b-alpha
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_05T14_25_13.678265
path:
- '**/details_harness|arc:challenge|25_2024-04-05T14-25-13.678265.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-05T14-25-13.678265.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_05T14_25_13.678265
path:
- '**/details_harness|gsm8k|5_2024-04-05T14-25-13.678265.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-05T14-25-13.678265.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_05T14_25_13.678265
path:
- '**/details_harness|hellaswag|10_2024-04-05T14-25-13.678265.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-05T14-25-13.678265.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_05T14_25_13.678265
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-05T14-25-13.678265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-05T14-25-13.678265.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-05T14-25-13.678265.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_05T14_25_13.678265
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-05T14-25-13.678265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-05T14-25-13.678265.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_05T14_25_13.678265
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-05T14-25-13.678265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-05T14-25-13.678265.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_05T14_25_13.678265
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-05T14-25-13.678265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-05T14-25-13.678265.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_05T14_25_13.678265
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-05T14-25-13.678265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-05T14-25-13.678265.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_05T14_25_13.678265
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-05T14-25-13.678265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-05T14-25-13.678265.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_05T14_25_13.678265
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-05T14-25-13.678265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-05T14-25-13.678265.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_05T14_25_13.678265
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-05T14-25-13.678265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-05T14-25-13.678265.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_05T14_25_13.678265
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-05T14-25-13.678265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-05T14-25-13.678265.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_05T14_25_13.678265
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-05T14-25-13.678265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-05T14-25-13.678265.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_05T14_25_13.678265
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-05T14-25-13.678265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-05T14-25-13.678265.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_05T14_25_13.678265
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-05T14-25-13.678265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-05T14-25-13.678265.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_05T14_25_13.678265
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-05T14-25-13.678265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-05T14-25-13.678265.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_05T14_25_13.678265
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-05T14-25-13.678265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-05T14-25-13.678265.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_05T14_25_13.678265
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-05T14-25-13.678265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-05T14-25-13.678265.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_05T14_25_13.678265
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-05T14-25-13.678265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-05T14-25-13.678265.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_05T14_25_13.678265
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-05T14-25-13.678265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-05T14-25-13.678265.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_05T14_25_13.678265
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-05T14-25-13.678265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-05T14-25-13.678265.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_05T14_25_13.678265
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-05T14-25-13.678265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-05T14-25-13.678265.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_05T14_25_13.678265
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-05T14-25-13.678265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-05T14-25-13.678265.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_05T14_25_13.678265
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-05T14-25-13.678265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-05T14-25-13.678265.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_05T14_25_13.678265
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-05T14-25-13.678265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-05T14-25-13.678265.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_05T14_25_13.678265
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-05T14-25-13.678265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-05T14-25-13.678265.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_05T14_25_13.678265
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-05T14-25-13.678265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-05T14-25-13.678265.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_05T14_25_13.678265
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-05T14-25-13.678265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-05T14-25-13.678265.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_05T14_25_13.678265
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-05T14-25-13.678265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-05T14-25-13.678265.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_05T14_25_13.678265
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-05T14-25-13.678265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-05T14-25-13.678265.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_05T14_25_13.678265
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-05T14-25-13.678265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-05T14-25-13.678265.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_05T14_25_13.678265
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-05T14-25-13.678265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-05T14-25-13.678265.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_05T14_25_13.678265
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-05T14-25-13.678265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-05T14-25-13.678265.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_05T14_25_13.678265
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-05T14-25-13.678265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-05T14-25-13.678265.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_05T14_25_13.678265
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-05T14-25-13.678265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-05T14-25-13.678265.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_05T14_25_13.678265
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-05T14-25-13.678265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-05T14-25-13.678265.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_05T14_25_13.678265
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-05T14-25-13.678265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-05T14-25-13.678265.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_05T14_25_13.678265
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-05T14-25-13.678265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-05T14-25-13.678265.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_05T14_25_13.678265
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-05T14-25-13.678265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-05T14-25-13.678265.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_05T14_25_13.678265
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-05T14-25-13.678265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-05T14-25-13.678265.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_05T14_25_13.678265
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-05T14-25-13.678265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-05T14-25-13.678265.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_05T14_25_13.678265
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-05T14-25-13.678265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-05T14-25-13.678265.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_05T14_25_13.678265
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-05T14-25-13.678265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-05T14-25-13.678265.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_05T14_25_13.678265
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-05T14-25-13.678265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-05T14-25-13.678265.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_05T14_25_13.678265
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-05T14-25-13.678265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-05T14-25-13.678265.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_05T14_25_13.678265
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-05T14-25-13.678265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-05T14-25-13.678265.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_05T14_25_13.678265
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-05T14-25-13.678265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-05T14-25-13.678265.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_05T14_25_13.678265
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-05T14-25-13.678265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-05T14-25-13.678265.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_05T14_25_13.678265
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-05T14-25-13.678265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-05T14-25-13.678265.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_05T14_25_13.678265
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-05T14-25-13.678265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-05T14-25-13.678265.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_05T14_25_13.678265
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-05T14-25-13.678265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-05T14-25-13.678265.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_05T14_25_13.678265
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-05T14-25-13.678265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-05T14-25-13.678265.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_05T14_25_13.678265
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-05T14-25-13.678265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-05T14-25-13.678265.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_05T14_25_13.678265
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-05T14-25-13.678265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-05T14-25-13.678265.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_05T14_25_13.678265
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-05T14-25-13.678265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-05T14-25-13.678265.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_05T14_25_13.678265
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-05T14-25-13.678265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-05T14-25-13.678265.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_05T14_25_13.678265
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-05T14-25-13.678265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-05T14-25-13.678265.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_05T14_25_13.678265
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-05T14-25-13.678265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-05T14-25-13.678265.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_05T14_25_13.678265
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-05T14-25-13.678265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-05T14-25-13.678265.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_05T14_25_13.678265
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-05T14-25-13.678265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-05T14-25-13.678265.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_05T14_25_13.678265
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-05T14-25-13.678265.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-05T14-25-13.678265.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_05T14_25_13.678265
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-05T14-25-13.678265.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-05T14-25-13.678265.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_05T14_25_13.678265
path:
- '**/details_harness|winogrande|5_2024-04-05T14-25-13.678265.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-05T14-25-13.678265.parquet'
- config_name: results
data_files:
- split: 2024_04_05T14_25_13.678265
path:
- results_2024-04-05T14-25-13.678265.parquet
- split: latest
path:
- results_2024-04-05T14-25-13.678265.parquet
---
# Dataset Card for Evaluation run of maldv/dragonwar-7b-alpha
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [maldv/dragonwar-7b-alpha](https://huggingface.co/maldv/dragonwar-7b-alpha) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_maldv__dragonwar-7b-alpha",
"harness_winogrande_5",
split="train")
```
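The per-run split names are derived from the run timestamp. Judging from the config in this card (this is an observation about the naming pattern, not an official API), `:` in the time part and `-` in the date part become `_`, while the fractional seconds keep their dot. A minimal sketch:

```python
# Sketch: map a run timestamp (as in the results filename) to the
# split name used in this card's configs. Based on the pattern
# "2024-04-05T14:25:13.678265" -> "2024_04_05T14_25_13.678265".
def timestamp_to_split(ts: str) -> str:
    date_part, time_part = ts.split("T")
    return date_part.replace("-", "_") + "T" + time_part.replace(":", "_")

print(timestamp_to_split("2024-04-05T14:25:13.678265"))
# 2024_04_05T14_25_13.678265
```

Alternatively, pass `split="latest"` to `load_dataset` to get the most recent run without computing the timestamped split name.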
## Latest results
These are the [latest results from run 2024-04-05T14:25:13.678265](https://huggingface.co/datasets/open-llm-leaderboard/details_maldv__dragonwar-7b-alpha/blob/main/results_2024-04-05T14-25-13.678265.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5637520301142515,
"acc_stderr": 0.0337124492685975,
"acc_norm": 0.5712635003505222,
"acc_norm_stderr": 0.03443150810775058,
"mc1": 0.28518971848225216,
"mc1_stderr": 0.015805827874454892,
"mc2": 0.43727256663199443,
"mc2_stderr": 0.014854735175793955
},
"harness|arc:challenge|25": {
"acc": 0.5409556313993175,
"acc_stderr": 0.014562291073601229,
"acc_norm": 0.5656996587030717,
"acc_norm_stderr": 0.014484703048857357
},
"harness|hellaswag|10": {
"acc": 0.6203943437562238,
"acc_stderr": 0.004842969887794103,
"acc_norm": 0.8153754232224656,
"acc_norm_stderr": 0.003871997616734269
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421296,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421296
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5407407407407407,
"acc_stderr": 0.04304979692464241,
"acc_norm": 0.5407407407407407,
"acc_norm_stderr": 0.04304979692464241
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.618421052631579,
"acc_stderr": 0.03953173377749194,
"acc_norm": 0.618421052631579,
"acc_norm_stderr": 0.03953173377749194
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5773584905660377,
"acc_stderr": 0.03040233144576954,
"acc_norm": 0.5773584905660377,
"acc_norm_stderr": 0.03040233144576954
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.625,
"acc_stderr": 0.04048439222695598,
"acc_norm": 0.625,
"acc_norm_stderr": 0.04048439222695598
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5606936416184971,
"acc_stderr": 0.037842719328874674,
"acc_norm": 0.5606936416184971,
"acc_norm_stderr": 0.037842719328874674
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082635,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082635
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.46382978723404256,
"acc_stderr": 0.032600385118357715,
"acc_norm": 0.46382978723404256,
"acc_norm_stderr": 0.032600385118357715
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3684210526315789,
"acc_stderr": 0.04537815354939391,
"acc_norm": 0.3684210526315789,
"acc_norm_stderr": 0.04537815354939391
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5103448275862069,
"acc_stderr": 0.04165774775728763,
"acc_norm": 0.5103448275862069,
"acc_norm_stderr": 0.04165774775728763
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3306878306878307,
"acc_stderr": 0.02422996529842508,
"acc_norm": 0.3306878306878307,
"acc_norm_stderr": 0.02422996529842508
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.04343525428949098,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.04343525428949098
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6387096774193548,
"acc_stderr": 0.027327548447957543,
"acc_norm": 0.6387096774193548,
"acc_norm_stderr": 0.027327548447957543
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4433497536945813,
"acc_stderr": 0.03495334582162934,
"acc_norm": 0.4433497536945813,
"acc_norm_stderr": 0.03495334582162934
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009181,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009181
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6717171717171717,
"acc_stderr": 0.03345678422756776,
"acc_norm": 0.6717171717171717,
"acc_norm_stderr": 0.03345678422756776
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7979274611398963,
"acc_stderr": 0.02897908979429673,
"acc_norm": 0.7979274611398963,
"acc_norm_stderr": 0.02897908979429673
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5794871794871795,
"acc_stderr": 0.025028610276710862,
"acc_norm": 0.5794871794871795,
"acc_norm_stderr": 0.025028610276710862
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.02794045713622842,
"acc_norm": 0.3,
"acc_norm_stderr": 0.02794045713622842
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5756302521008403,
"acc_stderr": 0.032104790510157764,
"acc_norm": 0.5756302521008403,
"acc_norm_stderr": 0.032104790510157764
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.03734535676787198,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.03734535676787198
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7467889908256881,
"acc_stderr": 0.01864407304137504,
"acc_norm": 0.7467889908256881,
"acc_norm_stderr": 0.01864407304137504
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4166666666666667,
"acc_stderr": 0.033622774366080424,
"acc_norm": 0.4166666666666667,
"acc_norm_stderr": 0.033622774366080424
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7401960784313726,
"acc_stderr": 0.03077855467869326,
"acc_norm": 0.7401960784313726,
"acc_norm_stderr": 0.03077855467869326
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7510548523206751,
"acc_stderr": 0.028146970599422644,
"acc_norm": 0.7510548523206751,
"acc_norm_stderr": 0.028146970599422644
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6188340807174888,
"acc_stderr": 0.03259625118416827,
"acc_norm": 0.6188340807174888,
"acc_norm_stderr": 0.03259625118416827
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6870229007633588,
"acc_stderr": 0.04066962905677697,
"acc_norm": 0.6870229007633588,
"acc_norm_stderr": 0.04066962905677697
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.036401182719909456,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.036401182719909456
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.04643454608906275,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.04643454608906275
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6687116564417178,
"acc_stderr": 0.03697983910025588,
"acc_norm": 0.6687116564417178,
"acc_norm_stderr": 0.03697983910025588
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.6990291262135923,
"acc_stderr": 0.045416094465039476,
"acc_norm": 0.6990291262135923,
"acc_norm_stderr": 0.045416094465039476
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8205128205128205,
"acc_stderr": 0.025140935950335452,
"acc_norm": 0.8205128205128205,
"acc_norm_stderr": 0.025140935950335452
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.722860791826309,
"acc_stderr": 0.01600563629412242,
"acc_norm": 0.722860791826309,
"acc_norm_stderr": 0.01600563629412242
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6502890173410405,
"acc_stderr": 0.025674281456531015,
"acc_norm": 0.6502890173410405,
"acc_norm_stderr": 0.025674281456531015
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3541899441340782,
"acc_stderr": 0.015995644947299235,
"acc_norm": 0.3541899441340782,
"acc_norm_stderr": 0.015995644947299235
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6633986928104575,
"acc_stderr": 0.02705797462449438,
"acc_norm": 0.6633986928104575,
"acc_norm_stderr": 0.02705797462449438
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.639871382636656,
"acc_stderr": 0.027264297599804012,
"acc_norm": 0.639871382636656,
"acc_norm_stderr": 0.027264297599804012
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.02712511551316686,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.02712511551316686
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4397163120567376,
"acc_stderr": 0.02960991207559411,
"acc_norm": 0.4397163120567376,
"acc_norm_stderr": 0.02960991207559411
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.42698826597131684,
"acc_stderr": 0.012633353557534425,
"acc_norm": 0.42698826597131684,
"acc_norm_stderr": 0.012633353557534425
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5625,
"acc_stderr": 0.030134614954403924,
"acc_norm": 0.5625,
"acc_norm_stderr": 0.030134614954403924
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5620915032679739,
"acc_stderr": 0.02007125788688653,
"acc_norm": 0.5620915032679739,
"acc_norm_stderr": 0.02007125788688653
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.04582004841505417,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.04582004841505417
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.636734693877551,
"acc_stderr": 0.030789051139030806,
"acc_norm": 0.636734693877551,
"acc_norm_stderr": 0.030789051139030806
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7810945273631841,
"acc_stderr": 0.029239174636647,
"acc_norm": 0.7810945273631841,
"acc_norm_stderr": 0.029239174636647
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4578313253012048,
"acc_stderr": 0.038786267710023595,
"acc_norm": 0.4578313253012048,
"acc_norm_stderr": 0.038786267710023595
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7660818713450293,
"acc_stderr": 0.03246721765117825,
"acc_norm": 0.7660818713450293,
"acc_norm_stderr": 0.03246721765117825
},
"harness|truthfulqa:mc|0": {
"mc1": 0.28518971848225216,
"mc1_stderr": 0.015805827874454892,
"mc2": 0.43727256663199443,
"mc2_stderr": 0.014854735175793955
},
"harness|winogrande|5": {
"acc": 0.7008681925808997,
"acc_stderr": 0.012868639066091536
},
"harness|gsm8k|5": {
"acc": 0.20318423047763456,
"acc_stderr": 0.011083227665267797
}
}
```
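The per-task scores above can be aggregated into a single MMLU number by averaging over the `hendrycksTest-*` entries. A minimal sketch of that macro-average, shown on a two-entry excerpt of a results dict shaped like the JSON above (the full dict would give the leaderboard's MMLU score):

```python
# Sketch: macro-average over MMLU (hendrycksTest) entries of a
# results dict shaped like the JSON above. Uses a small excerpt
# rather than the full results file.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc_norm": 0.28},
    "harness|hendrycksTest-anatomy|5": {"acc_norm": 0.5407407407407407},
    "harness|winogrande|5": {"acc": 0.7008681925808997},  # not MMLU
}
mmlu = [v["acc_norm"] for k, v in results.items()
        if k.startswith("harness|hendrycksTest-")]
print(round(sum(mmlu) / len(mmlu), 4))
# 0.4104
```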
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
samantaofc/seemyaa | ---
license: openrail
---
|
NavidVafaei/rottentomato01 | ---
annotations_creators:
- expert-generated
language_creators:
- expert-generated
language:
- en
license:
- cc-by-nc-nd-4.0
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- summarization
task_ids: []
paperswithcode_id: rottento
pretty_name: rottento Corpus
tags:
- conversations-summarization
dataset_info:
features:
- name: movie
dtype: string
- name: id
dtype: string
- name: reviews
dtype: array
- name: summary
dtype: string
config_name: rottento
splits:
- name: train
num_bytes: 9479141
num_examples: 14732
- name: test
num_bytes: 534492
num_examples: 819
- name: validation
num_bytes: 516431
num_examples: 818
download_size: 2944100
dataset_size: 10530064
train-eval-index:
- config: rottento
task: summarization
task_id: summarization
splits:
eval_split: test
col_mapping:
dialogue: text
summary: target
---
# Dataset Card for rottento Corpus
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://arxiv.org/abs/1911.12237v2
- **Repository:** [Needs More Information]
- **Paper:** https://arxiv.org/abs/1911.12237v2
- **Leaderboard:** [Needs More Information]
- **Point of Contact:** [Needs More Information]
### Dataset Summary
The SAMSum dataset contains about 16k messenger-like conversations with summaries. Conversations were created and written down by linguists fluent in English. Linguists were asked to create conversations similar to those they write on a daily basis, reflecting the proportion of topics of their real-life messenger conversations. The style and register are diversified: conversations can be informal, semi-formal or formal, and they may contain slang words, emoticons and typos. The conversations were then annotated with summaries, on the assumption that a summary should be a concise, third-person brief of what people talked about in the conversation.
The SAMSum dataset was prepared by Samsung R&D Institute Poland and is distributed for research purposes (non-commercial licence: CC BY-NC-ND 4.0).
### Supported Tasks and Leaderboards
[Needs More Information]
### Languages
English
## Dataset Structure
### Data Instances
The dataset is made up of 16,369 conversations distributed uniformly into 4 groups based on the number of utterances in a conversation: 3-6, 7-12, 13-18 and 19-30. Each utterance contains the name of the speaker. Most conversations consist of dialogues between two interlocutors (about 75% of all conversations); the rest are between three or more people.
The first instance in the training set:
{'id': '13818513', 'summary': 'Amanda baked cookies and will bring Jerry some tomorrow.', 'dialogue': "Amanda: I baked cookies. Do you want some?\r\nJerry: Sure!\r\nAmanda: I'll bring you tomorrow :-)"}
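Since the `dialogue` field packs the whole conversation into a single string (utterances separated by `\r\n`, each prefixed with the speaker's name and a colon), here is a minimal sketch of splitting it back into (speaker, utterance) pairs, using the instance above:

```python
# Each SAMSum dialogue is one string: utterances are separated by "\r\n"
# and each utterance is prefixed with the speaker's name and a colon.
example = {
    "id": "13818513",
    "summary": "Amanda baked cookies and will bring Jerry some tomorrow.",
    "dialogue": "Amanda: I baked cookies. Do you want some?\r\nJerry: Sure!\r\nAmanda: I'll bring you tomorrow :-)",
}

def split_utterances(dialogue: str) -> list[tuple[str, str]]:
    """Split a dialogue string into (speaker, utterance) pairs."""
    pairs = []
    for line in dialogue.split("\r\n"):
        speaker, _, text = line.partition(": ")
        pairs.append((speaker, text))
    return pairs

print(split_utterances(example["dialogue"])[0])
# -> ('Amanda', 'I baked cookies. Do you want some?')
```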
### Data Fields
- dialogue: text of dialogue.
- summary: human written summary of the dialogue.
- id: unique id of an example.
### Data Splits
- train: 14732
- val: 818
- test: 819
## Dataset Creation
### Curation Rationale
In paper:
> In the first approach, we reviewed datasets from the following categories: chatbot dialogues, SMS corpora, IRC/chat data, movie dialogues, tweets, comments data (conversations formed by replies to comments), transcription of meetings, written discussions, phone dialogues and daily communication data. Unfortunately, they all differed in some respect from the conversations that are typically written in messenger apps, e.g. they were too technical (IRC data), too long (comments data, transcription of meetings), lacked context (movie dialogues) or they were more of a spoken type, such as a dialogue between a petrol station assistant and a client buying petrol.
As a consequence, we decided to create a chat dialogue dataset by constructing such conversations that would epitomize the style of a messenger app.
### Source Data
#### Initial Data Collection and Normalization
In paper:
> We asked linguists to create conversations similar to those they write on a daily basis, reflecting the proportion of topics of their real-life messenger conversations. It includes chit-chats, gossiping about friends, arranging meetings, discussing politics, consulting university assignments with colleagues, etc. Therefore, this dataset does not contain any sensitive data or fragments of other corpora.
#### Who are the source language producers?
linguists
### Annotations
#### Annotation process
In paper:
> Each dialogue was created by one person. After collecting all of the conversations, we asked language experts to annotate them with summaries, assuming that they should (1) be rather short, (2) extract important pieces of information, (3) include names of interlocutors, (4) be written in the third person. Each dialogue contains only one reference summary.
#### Who are the annotators?
language experts
### Personal and Sensitive Information
None, see above: Initial Data Collection and Normalization
## Considerations for Using the Data
### Social Impact of Dataset
[Needs More Information]
### Discussion of Biases
[Needs More Information]
### Other Known Limitations
[Needs More Information]
## Additional Information
### Dataset Curators
[Needs More Information]
### Licensing Information
non-commercial licence: CC BY-NC-ND 4.0
### Citation Information
```
@inproceedings{gliwa-etal-2019-samsum,
title = "{SAMS}um Corpus: A Human-annotated Dialogue Dataset for Abstractive Summarization",
author = "Gliwa, Bogdan and
Mochol, Iwona and
Biesek, Maciej and
Wawer, Aleksander",
booktitle = "Proceedings of the 2nd Workshop on New Frontiers in Summarization",
month = nov,
year = "2019",
address = "Hong Kong, China",
publisher = "Association for Computational Linguistics",
url = "https://www.aclweb.org/anthology/D19-5409",
doi = "10.18653/v1/D19-5409",
pages = "70--79"
}
```
### Contributions
Thanks to [@cccntu](https://github.com/cccntu) for adding this dataset. |
result-kand2-sdxl-wuerst-karlo/8a14fb4c | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 174
num_examples: 10
download_size: 1325
dataset_size: 174
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "8a14fb4c"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jxie/qg-tagging-normalized | ---
dataset_info:
features:
- name: inputs
sequence:
sequence: float64
- name: label
dtype: int64
splits:
- name: train
num_bytes: 6944726400
num_examples: 1600000
- name: val
num_bytes: 868957000
num_examples: 200000
- name: test
num_bytes: 868286700
num_examples: 200000
download_size: 3812296127
dataset_size: 8681970100
---
# Dataset Card for "qg-tagging-normalized"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
cryptonite | ---
annotations_creators:
- expert-generated
language_creators:
- expert-generated
language:
- en
license:
- cc-by-nc-4.0
multilinguality:
- monolingual
size_categories:
- 100K<n<1M
- 1K<n<10K
source_datasets:
- original
task_categories:
- question-answering
task_ids:
- open-domain-qa
paperswithcode_id: null
pretty_name: Cryptonite
dataset_info:
- config_name: default
features:
- name: agent_info
sequence:
- name: Bottomline
dtype: string
- name: Role
dtype: string
- name: Target
dtype: float32
- name: agent_turn
sequence: int32
- name: dialogue_acts
sequence:
- name: intent
dtype: string
- name: price
dtype: float32
- name: utterance
sequence: string
- name: items
sequence:
- name: Category
dtype: string
- name: Images
dtype: string
- name: Price
dtype: float32
- name: Description
dtype: string
- name: Title
dtype: string
splits:
- name: train
num_bytes: 8538836
num_examples: 5247
- name: test
num_bytes: 1353933
num_examples: 838
- name: validation
num_bytes: 966032
num_examples: 597
download_size: 25373618
dataset_size: 10858801
- config_name: cryptonite
features:
- name: clue
dtype: string
- name: answer
dtype: string
- name: enumeration
dtype: string
- name: publisher
dtype: string
- name: date
dtype: int64
- name: quick
dtype: bool
- name: id
dtype: string
splits:
- name: train
num_bytes: 52228597
num_examples: 470804
- name: validation
num_bytes: 2901768
num_examples: 26156
- name: test
num_bytes: 2908275
num_examples: 26157
download_size: 21615952
dataset_size: 58038640
config_names:
- cryptonite
- default
---
# Dataset Card for Cryptonite
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [Github](https://github.com/aviaefrat/cryptonite)
- **Repository:** [Github](https://github.com/aviaefrat/cryptonite)
- **Paper:** [Arxiv](https://arxiv.org/pdf/2103.01242.pdf)
- **Leaderboard:**
- **Point of Contact:** [Twitter](https://twitter.com/AviaEfrat)
### Dataset Summary
Current NLP datasets targeting ambiguity can be solved by a native speaker with relative ease. We present Cryptonite, a large-scale dataset based on cryptic crosswords, which is both linguistically complex and naturally sourced. Each example in Cryptonite is a cryptic clue, a short phrase or sentence with a misleading surface reading, whose solving requires disambiguating semantic, syntactic, and phonetic wordplays, as well as world knowledge. Cryptic clues pose a challenge even for experienced solvers, though top-tier experts can solve them with almost 100% accuracy. Cryptonite is a challenging task for current models; fine-tuning T5-Large on 470k cryptic clues achieves only 7.6% accuracy, on par with the accuracy of a rule-based clue solver (8.6%).
### Languages
English
## Dataset Structure
### Data Instances
This is one example from the train set.
```python
{
'clue': 'make progress socially in stated region (5)',
'answer': 'climb',
'date': 971654400000,
'enumeration': '(5)',
'id': 'Times-31523-6across',
'publisher': 'Times',
'quick': False
}
```
### Data Fields
- `clue`: a string representing the clue provided for the crossword
- `answer`: a string representing the answer to the clue
- `enumeration`: a string representing the enumeration of the answer, i.e. the letter-count pattern shown at the end of the clue (e.g. `(5)` for a five-letter answer)
- `publisher`: a string representing the publisher of the crossword
- `date`: an int64 representing the UNIX timestamp (in milliseconds) of the date of publication of the crossword
- `quick`: a bool representing whether the crossword is quick (a crossword aimed at beginners, easier to solve)
- `id`: a string to uniquely identify a given example in the dataset
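The `date` value in the example instance above (`971654400000`) is a millisecond-precision UNIX timestamp, so it should be divided by 1000 before decoding; a small sketch with the standard library:

```python
from datetime import datetime, timezone

# `date` is a UNIX timestamp in milliseconds (note the magnitude of the
# value in the example instance), so divide by 1000 before decoding.
date_ms = 971654400000
published = datetime.fromtimestamp(date_ms / 1000, tz=timezone.utc)
print(published.date())  # -> 2000-10-16
```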
### Data Splits
Train (470,804 examples), validation (26,156 examples), test (26,157 examples).
## Dataset Creation
### Curation Rationale
Crosswords from the Times and the Telegraph.
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
Avia Efrat, Uri Shaham, Dan Kilman, Omer Levy
### Licensing Information
`cc-by-nc-4.0`
### Citation Information
```
@misc{efrat2021cryptonite,
title={Cryptonite: A Cryptic Crossword Benchmark for Extreme Ambiguity in Language},
author={Avia Efrat and Uri Shaham and Dan Kilman and Omer Levy},
year={2021},
eprint={2103.01242},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
### Contributions
Thanks to [@theo-m](https://github.com/theo-m) for adding this dataset. |
rishitunu/ECC_crackdataset_withsplit | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 9770063.344
num_examples: 1031
- name: test
num_bytes: 2441398.0
num_examples: 258
download_size: 12018420
dataset_size: 12211461.344
---
# Dataset Card for "ECC_crackdataset_withsplit"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Alex5666/Military-Aircraft-Recognition-dataset | ---
license: apache-2.0
task_categories:
- image-classification
- image-segmentation
- image-to-text
- image-to-image
- object-detection
- depth-estimation
tags:
- legal
size_categories:
- 1M<n<10M
---
This is a remote sensing image dataset for military aircraft recognition. It includes 3,842 images, 20 aircraft types, and 22,341 instances annotated with horizontal bounding boxes and oriented bounding boxes. |
AlexWortega/pixels | ---
license: mit
---
|
Price11/guanaco-llama2-1k | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1654448
num_examples: 1000
download_size: 966692
dataset_size: 1654448
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
LucasThil/miniwob_plusplus_T5_basic | ---
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: snippet
dtype: string
- name: actions
dtype: string
- name: uterances
dtype: string
splits:
- name: train
num_bytes: 8276379
num_examples: 14551
download_size: 1175429
dataset_size: 8276379
---
# Dataset Card for "miniwob_plusplus_T5_basic"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
PaulineSanchez/Traduction_en_fr_food | ---
task_categories:
- translation
language:
- fr
- en
dataset_info:
features:
- name: alim_nom_fr
dtype: string
- name: alim_nom_eng
dtype: string
splits:
- name: train
num_bytes: 238948
num_examples: 3153
download_size: 114072
dataset_size: 238948
---
- info: This dataset comes from the English version of the ANSES-CIQUAL 2020 food composition table in XML format, found on https://www.data.gouv.fr/fr/datasets/table-de-composition-nutritionnelle-des-aliments-ciqual/ |
reciprocate/fair-eval | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: selected
dtype: string
- name: rejected
dtype: string
splits:
- name: train
num_bytes: 179384
num_examples: 66
download_size: 117180
dataset_size: 179384
---
# Dataset Card for "fair-eval"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Crystalcareai/truthyDPO-intel | ---
license: cc-by-4.0
---
From https://huggingface.co/jondurbin - I just renamed one of the columns to make axolotl happier.
## Truthy DPO
This is a dataset designed to enhance the overall truthfulness of LLMs, without sacrificing immersion when roleplaying as a human.
For example, in normal AI assistant mode, the model should not try to describe what the warmth of the sun feels like, but if the system prompt indicates it's a human, it should.
Mostly targets corporeal, spatial, and temporal awareness, and common misconceptions.
### Contribute
If you're interested in new functionality/datasets, take a look at [bagel repo](https://github.com/jondurbin/bagel) and [airoboros](https://github.com/jondurbin/airoboros) and either make a PR or open an issue with details.
To help me with the fine-tuning costs, dataset generation, etc., please use one of the following:
- https://bmc.link/jondurbin
- ETH 0xce914eAFC2fe52FdceE59565Dd92c06f776fcb11
- BTC bc1qdwuth4vlg8x37ggntlxu5cjfwgmdy5zaa7pswf |
sakun/facedepth | ---
license: afl-3.0
---
|
the-cramer-project/kyrgyz_MNIST | ---
license: cc-by-nc-4.0
language:
- en
- ky
pretty_name: MNIST like dataset for kyrgyz language
---
## EN: Kyrgyz language hand-written letters (kyrgyz MNIST)
A repository of images (in CSV format) of hand-written Kyrgyz alphabet letters for machine learning applications. The original images have been resized to 50x50 pixels and then converted to CSV format.
The repository currently consists of 80,213 (50x50 pixel) images representing all 36 letters of the Kyrgyz alphabet. These images have been hand-written.
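Because the images ship as flattened rows in CSV files, a typical first step is reshaping a row back into a 50x50 array. The row layout sketched below (a label column followed by 2500 pixel columns) is an assumption rather than documented behavior; check the actual CSV header before use:

```python
import numpy as np

# Hypothetical row layout: column 0 is the class label (one of the 36
# letters), columns 1..2500 are the flattened 50x50 grayscale pixels.
row = np.arange(2501, dtype=np.float32)  # stand-in for one CSV row
label = int(row[0])
image = row[1:].reshape(50, 50)
print(image.shape)  # -> (50, 50)
```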
### Kaggle competition:
For those who want to compete on Kaggle, here is the invitation: https://www.kaggle.com/t/e185ead3ba2f47509f0ca3a8dbec418e
### Acknowledgements
We thank Ilgiz Zhumaev for providing this dataset.
Original images (278х278) could be downloaded from: https://www.kaggle.com/datasets/ilgizzhumaev/database-of-36-handwritten-kyrgyz-letters?sort=votes
## KG: Кыргыз тилиндеги колго жазылган тамгалар (кыргыз MNIST)
Machine Learning жана жасалма интеллект колдонмолору үчүн кол менен жазылган кыргыз алиппесинин сүрөттөрүнүн репозиториясы. Оригалдуу сүрөттөр биринчинден 278x278-ден 50х50 пиксель сүрөткө которулган, анан csv форматына өзгөртүлгөн.
Учурда репозиторий кыргыз алиппесинин бардык 36 тамгасын чагылдырган 80213 (50x50 пиксель) сүрөттөрдөн турат. Бул сүрөттөр кол менен жазылган.
### Kaggle-дагы мелдеш
Kaggle мелдешине катышууну каалагандар үчүн бул жерде чакыруу: https://www.kaggle.com/t/e185ead3ba2f47509f0ca3a8dbec418e
### Ыраазычылык
Бул датасет топтомун бергени үчүн Илгиз Жумаевге ыраазычылык билдиребиз.
Оригиналдуу сүрөттөрдү (278х278) төмөнкү жерден жүктөп алса болот: https://www.kaggle.com/datasets/ilgizzhumaev/database-of-36-handwritten-kyrgyz-letters?sort=votes
## License
A big thanks to all the contributors: Timur Turatali, Ilgiz Zhumaev, Ulan Abdurazakov, Nursultan Bakashov, Altynai Mambetova, Meerim Abdrakhmanova.
Kyrgyz MNIST dataset is licensed under a [Attribution-NonCommercial 4.0 International (CC BY-NC 4.0)](https://creativecommons.org/licenses/by-nc/4.0/) |
chronbmm/sanskrit-sandhi-split-sighum | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: unsandhied
dtype: string
splits:
- name: train
num_bytes: 10973642
num_examples: 99889
- name: validation
num_bytes: 470141
num_examples: 4200
- name: test
num_bytes: 470141
num_examples: 4200
- name: test_500
num_bytes: 58711
num_examples: 500
- name: validation_500
num_bytes: 58711
num_examples: 500
download_size: 7463353
dataset_size: 12031346
---
# Dataset Card for "sanskrit-sandhi-split-sighum"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Sammarieo/asr_capstone_test | ---
dataset_info:
features:
- name: path
dtype: audio
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 4118361.8494623657
num_examples: 19
download_size: 3257710
dataset_size: 4118361.8494623657
---
# Dataset Card for "asr_capstone_test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
JackismyShephard/nst-da-norm | ---
annotations_creators:
- machine-generated
- expert-generated
language_creators:
- expert-generated
language: da
license: cc0-1.0
multilinguality: monolingual
size_categories: 100K<n<1M
source_datasets: extended
task_categories:
- automatic-speech-recognition
- text-to-speech
pretty_name: NST-da Normalized
---
# Dataset Card for NST-da Normalized
<!-- Provide a quick summary of the dataset. -->
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** da
- **License:** cc0-1.0
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
9wimu9/news_first_dataset | ---
dataset_info:
features:
- name: date
dtype: string
- name: title
dtype: string
- name: short_title
dtype: string
- name: content
dtype: string
splits:
- name: train
num_bytes: 68922136
num_examples: 28680
download_size: 25841744
dataset_size: 68922136
---
# Dataset Card for "news_first_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CVasNLPExperiments/VQAv2_sample_validation_benchmarks_partition_0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: response
dtype: string
splits:
- name: train
num_bytes: 63
num_examples: 2
download_size: 1388
dataset_size: 63
---
# Dataset Card for "VQAv2_sample_validation_benchmarks_partition_0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
MiXaiLL76/ru_youtube_dataset | ---
dataset_info:
features:
- name: text
dtype: string
- name: text_len
dtype: int64
- name: language
dtype: string
- name: speaker_name
dtype: string
- name: duration
dtype: float64
- name: audio
dtype: audio
- name: file_size
dtype: int64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 109616973.04
num_examples: 1630
download_size: 112127863
dataset_size: 109616973.04
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
didi0di/KoWoW | ---
license: mit
task_categories:
- text-generation
pretty_name: KoWoW
size_categories:
- 100K<n<1M
language:
- ko
tags:
- chat
- mutiturn conversation
- chatbot
---
# Dataset Card for KoWoW
<!-- Provide a quick summary of the dataset. -->
## Dataset Summary
This dataset is a Korean translation of WoW (Wizard of Wikipedia).
### Dataset Description
This is a Korean translation of WoW (Wizard of Wikipedia), a knowledge-grounded dialogue dataset.
Each conversation is made up of a bundle of several dialogs; there are 22,311 conversations and 201,999 dialogs in total.
This dataset contains only the ko version, in which both the knowledge and the utterances are in Korean.
- **Language(s) (NLP):** ko
- **License:** mit
### Dataset Sources
<!-- Provide the basic links for the dataset. -->
- **Repository:** https://github.com/AIRC-KETI/kowow/tree/master
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
WoW (Wizard of Wikipedia)
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
- AIRC-KETI |
jkv53/13F_Reports | ---
dataset_info:
features:
- name: title
dtype: string
- name: body
dtype: string
splits:
- name: train
num_bytes: 12636095
num_examples: 1113
download_size: 3367995
dataset_size: 12636095
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "13F_Reports"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AravindVadlapudi02/UA_speech_noisereduced_test-0.7_train-0.3 | ---
dataset_info:
features:
- name: label
dtype:
class_label:
names:
'0': healthy control
'1': pathology
- name: input_features
sequence:
sequence: float32
splits:
- name: train
num_bytes: 1609516432
num_examples: 1676
- name: test
num_bytes: 3757779116
num_examples: 3913
download_size: 619867591
dataset_size: 5367295548
---
# Dataset Card for "UA_speech_noisereduced_test-0.7_train-0.3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
timothyckl/ToolBenchPlanner | ---
license: mit
dataset_info:
features:
- name: target
dtype: string
- name: input
dtype: string
splits:
- name: train
num_bytes: 2547893918
num_examples: 490775
download_size: 483781458
dataset_size: 2547893918
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_internlm__internlm-20b-chat | ---
pretty_name: Evaluation run of internlm/internlm-20b-chat
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [internlm/internlm-20b-chat](https://huggingface.co/internlm/internlm-20b-chat)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 3 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_internlm__internlm-20b-chat_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-08T17:10:14.815999](https://huggingface.co/datasets/open-llm-leaderboard/details_internlm__internlm-20b-chat_public/blob/main/results_2023-11-08T17-10-14.815999.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.016149328859060404,\n\
\ \"em_stderr\": 0.001290866955681033,\n \"f1\": 0.10652579697986504,\n\
\ \"f1_stderr\": 0.0024253657455140664,\n \"acc\": 0.4874752643801689,\n\
\ \"acc_stderr\": 0.011119649443380293\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.016149328859060404,\n \"em_stderr\": 0.001290866955681033,\n\
\ \"f1\": 0.10652579697986504,\n \"f1_stderr\": 0.0024253657455140664\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.18726307808946172,\n \
\ \"acc_stderr\": 0.01074591419951081\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7876874506708761,\n \"acc_stderr\": 0.011493384687249779\n\
\ }\n}\n```"
repo_url: https://huggingface.co/internlm/internlm-20b-chat
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_drop_3
data_files:
- split: 2023_11_08T17_10_14.815999
path:
- '**/details_harness|drop|3_2023-11-08T17-10-14.815999.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-08T17-10-14.815999.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_08T17_10_14.815999
path:
- '**/details_harness|gsm8k|5_2023-11-08T17-10-14.815999.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-08T17-10-14.815999.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_08T17_10_14.815999
path:
- '**/details_harness|winogrande|5_2023-11-08T17-10-14.815999.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-08T17-10-14.815999.parquet'
- config_name: results
data_files:
- split: 2023_11_08T17_10_14.815999
path:
- results_2023-11-08T17-10-14.815999.parquet
- split: latest
path:
- results_2023-11-08T17-10-14.815999.parquet
---
# Dataset Card for Evaluation run of internlm/internlm-20b-chat
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/internlm/internlm-20b-chat
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [internlm/internlm-20b-chat](https://huggingface.co/internlm/internlm-20b-chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_internlm__internlm-20b-chat_public",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-11-08T17:10:14.815999](https://huggingface.co/datasets/open-llm-leaderboard/details_internlm__internlm-20b-chat_public/blob/main/results_2023-11-08T17-10-14.815999.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.016149328859060404,
"em_stderr": 0.001290866955681033,
"f1": 0.10652579697986504,
"f1_stderr": 0.0024253657455140664,
"acc": 0.4874752643801689,
"acc_stderr": 0.011119649443380293
},
"harness|drop|3": {
"em": 0.016149328859060404,
"em_stderr": 0.001290866955681033,
"f1": 0.10652579697986504,
"f1_stderr": 0.0024253657455140664
},
"harness|gsm8k|5": {
"acc": 0.18726307808946172,
"acc_stderr": 0.01074591419951081
},
"harness|winogrande|5": {
"acc": 0.7876874506708761,
"acc_stderr": 0.011493384687249779
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_KoboldAI__fairseq-dense-125M | ---
pretty_name: Evaluation run of KoboldAI/fairseq-dense-125M
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [KoboldAI/fairseq-dense-125M](https://huggingface.co/KoboldAI/fairseq-dense-125M)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_KoboldAI__fairseq-dense-125M\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-19T04:17:04.516677](https://huggingface.co/datasets/open-llm-leaderboard/details_KoboldAI__fairseq-dense-125M/blob/main/results_2023-10-19T04-17-04.516677.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.016149328859060404,\n\
\ \"em_stderr\": 0.001290866955681,\n \"f1\": 0.0550492869127517,\n\
\ \"f1_stderr\": 0.0017708903745179366,\n \"acc\": 0.25295974743488553,\n\
\ \"acc_stderr\": 0.007025750419242903\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.016149328859060404,\n \"em_stderr\": 0.001290866955681,\n\
\ \"f1\": 0.0550492869127517,\n \"f1_stderr\": 0.0017708903745179366\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5059194948697711,\n\
\ \"acc_stderr\": 0.014051500838485807\n }\n}\n```"
repo_url: https://huggingface.co/KoboldAI/fairseq-dense-125M
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T13_55_37.353557
path:
- '**/details_harness|arc:challenge|25_2023-07-19T13:55:37.353557.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T13:55:37.353557.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_19T04_17_04.516677
path:
- '**/details_harness|drop|3_2023-10-19T04-17-04.516677.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-19T04-17-04.516677.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_19T04_17_04.516677
path:
- '**/details_harness|gsm8k|5_2023-10-19T04-17-04.516677.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-19T04-17-04.516677.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T13_55_37.353557
path:
- '**/details_harness|hellaswag|10_2023-07-19T13:55:37.353557.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T13:55:37.353557.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T13_55_37.353557
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:55:37.353557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T13:55:37.353557.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:55:37.353557.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T13_55_37.353557
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:55:37.353557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:55:37.353557.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T13_55_37.353557
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:55:37.353557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:55:37.353557.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T13_55_37.353557
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:55:37.353557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:55:37.353557.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T13_55_37.353557
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:55:37.353557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:55:37.353557.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T13_55_37.353557
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:55:37.353557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:55:37.353557.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T13_55_37.353557
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:55:37.353557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:55:37.353557.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T13_55_37.353557
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:55:37.353557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:55:37.353557.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T13_55_37.353557
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:55:37.353557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:55:37.353557.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T13_55_37.353557
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:55:37.353557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:55:37.353557.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T13_55_37.353557
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:55:37.353557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:55:37.353557.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T13_55_37.353557
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:55:37.353557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:55:37.353557.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T13_55_37.353557
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:55:37.353557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:55:37.353557.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T13_55_37.353557
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:55:37.353557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:55:37.353557.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T13_55_37.353557
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:55:37.353557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:55:37.353557.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T13_55_37.353557
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:55:37.353557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:55:37.353557.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T13_55_37.353557
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:55:37.353557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:55:37.353557.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T13_55_37.353557
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:55:37.353557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:55:37.353557.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T13_55_37.353557
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:55:37.353557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:55:37.353557.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T13_55_37.353557
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:55:37.353557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:55:37.353557.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T13_55_37.353557
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:55:37.353557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:55:37.353557.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T13_55_37.353557
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:55:37.353557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:55:37.353557.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T13_55_37.353557
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:55:37.353557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:55:37.353557.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T13_55_37.353557
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:55:37.353557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:55:37.353557.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T13_55_37.353557
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:55:37.353557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:55:37.353557.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T13_55_37.353557
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:55:37.353557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:55:37.353557.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T13_55_37.353557
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:55:37.353557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:55:37.353557.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T13_55_37.353557
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:55:37.353557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:55:37.353557.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T13_55_37.353557
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:55:37.353557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:55:37.353557.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T13_55_37.353557
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:55:37.353557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:55:37.353557.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T13_55_37.353557
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:55:37.353557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:55:37.353557.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T13_55_37.353557
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:55:37.353557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:55:37.353557.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T13_55_37.353557
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:55:37.353557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:55:37.353557.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T13_55_37.353557
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:55:37.353557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:55:37.353557.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T13_55_37.353557
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:55:37.353557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:55:37.353557.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T13_55_37.353557
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:55:37.353557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:55:37.353557.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T13_55_37.353557
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:55:37.353557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:55:37.353557.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T13_55_37.353557
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:55:37.353557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:55:37.353557.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T13_55_37.353557
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:55:37.353557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:55:37.353557.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T13_55_37.353557
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T13:55:37.353557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T13:55:37.353557.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T13_55_37.353557
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:55:37.353557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:55:37.353557.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T13_55_37.353557
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:55:37.353557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:55:37.353557.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T13_55_37.353557
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:55:37.353557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:55:37.353557.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T13_55_37.353557
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:55:37.353557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:55:37.353557.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T13_55_37.353557
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:55:37.353557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:55:37.353557.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T13_55_37.353557
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:55:37.353557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:55:37.353557.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T13_55_37.353557
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:55:37.353557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:55:37.353557.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T13_55_37.353557
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:55:37.353557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:55:37.353557.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T13_55_37.353557
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:55:37.353557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:55:37.353557.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T13_55_37.353557
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:55:37.353557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:55:37.353557.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T13_55_37.353557
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:55:37.353557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:55:37.353557.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T13_55_37.353557
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:55:37.353557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:55:37.353557.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T13_55_37.353557
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:55:37.353557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:55:37.353557.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T13_55_37.353557
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:55:37.353557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:55:37.353557.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T13_55_37.353557
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:55:37.353557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:55:37.353557.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T13_55_37.353557
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:55:37.353557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:55:37.353557.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T13_55_37.353557
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T13:55:37.353557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T13:55:37.353557.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T13_55_37.353557
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:55:37.353557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:55:37.353557.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T13_55_37.353557
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T13:55:37.353557.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T13:55:37.353557.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_19T04_17_04.516677
path:
- '**/details_harness|winogrande|5_2023-10-19T04-17-04.516677.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-19T04-17-04.516677.parquet'
- config_name: results
data_files:
- split: 2023_07_19T13_55_37.353557
path:
- results_2023-07-19T13:55:37.353557.parquet
- split: 2023_10_19T04_17_04.516677
path:
- results_2023-10-19T04-17-04.516677.parquet
- split: latest
path:
- results_2023-10-19T04-17-04.516677.parquet
---
# Dataset Card for Evaluation run of KoboldAI/fairseq-dense-125M
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/KoboldAI/fairseq-dense-125M
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [KoboldAI/fairseq-dense-125M](https://huggingface.co/KoboldAI/fairseq-dense-125M) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_KoboldAI__fairseq-dense-125M",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-19T04:17:04.516677](https://huggingface.co/datasets/open-llm-leaderboard/details_KoboldAI__fairseq-dense-125M/blob/main/results_2023-10-19T04-17-04.516677.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.016149328859060404,
"em_stderr": 0.001290866955681,
"f1": 0.0550492869127517,
"f1_stderr": 0.0017708903745179366,
"acc": 0.25295974743488553,
"acc_stderr": 0.007025750419242903
},
"harness|drop|3": {
"em": 0.016149328859060404,
"em_stderr": 0.001290866955681,
"f1": 0.0550492869127517,
"f1_stderr": 0.0017708903745179366
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.5059194948697711,
"acc_stderr": 0.014051500838485807
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
kammavidya/AI | ---
task_categories:
- question-answering
language:
- en
---
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
tuanacanal/conversation-ds-3 | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 239645.0534228188
num_examples: 3352
- name: validation
num_bytes: 26666.946577181207
num_examples: 373
download_size: 183025
dataset_size: 266312.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
LambdaTests/VQAv2Validation_ViT_H_14_A_T_C_Q_benchmarks_partition_global_15_10000000 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: response
dtype: string
splits:
- name: train
num_bytes: 192585
num_examples: 6699
download_size: 123664
dataset_size: 192585
---
# Dataset Card for "VQAv2Validation_ViT_H_14_A_T_C_Q_benchmarks_partition_global_15_10000000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
davanstrien/Inflection-Benchmarks | ---
configs:
- config_name: mt_bench_inf
data_files:
- split: train
path: data/mt_bench_inf.jsonl
- config_name: physics_gre
data_files:
- split: train
path: data/physics_gre.jsonl
- config_name: physics_gre_scored
data_files:
- split: train
path: data/physics_gre_scored.jsonl
license: mit
---
Cloned from https://github.com/InflectionAI/Inflection-Benchmarks
# MT-Bench Inf
In `mt_bench_inf.jsonl` we release a corrected version of the [MT-Bench](https://github.com/lm-sys/FastChat?tab=readme-ov-file#evaluation) questions that we use for evaluation. Each entry has the following fields:
* `question_id`: The question number
* `category`: Which MT-Bench category
* `turn`: A list with the turns
* `reference [optional]`: A reference answer
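Each line of the file is a single JSON object with the fields above; a minimal reader for a local copy of `mt_bench_inf.jsonl` could look like this (the file path is illustrative):

```python
import json

def load_jsonl(path):
    """Read a JSON-Lines file into a list of dicts, skipping blank lines."""
    with open(path, encoding="utf-8") as f:
        return [json.loads(line) for line in f if line.strip()]

# questions = load_jsonl("data/mt_bench_inf.jsonl")
# print(questions[0]["question_id"], questions[0]["category"])
```

Alternatively, the `mt_bench_inf` config declared in this repository can be loaded directly with `datasets.load_dataset`.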
Below, we show a few examples of questions, the original GPT-4 Reference answer, and our corrected answer:
### Reasoning #104
Question:
```
David has three sisters. Each of them has one brother. How many brothers does David have?
```
GPT-4 Reference Answer:
```
David has only one brother.
```
Our Corrected Answer:
```
David has no brother. He is the one brother of his three sisters.
```
### Math #111
Question:
```
The vertices of a triangle are at points (0, 0), (-1, 1), and (3, 3). What is the area of the triangle?
```
GPT-4 Reference Answer:
```
To find the area of the triangle, we can use the formula for the area of a triangle with coordinates A(x1, y1), B(x2, y2), and C(x3, y3):
Area = (1/2) * |(x1(y2 - y3) + x2(y3 - y1) + x3(y1 - y2))|
In this case, the coordinates are A(0, 0), B(-1, 1), and C(3, 3). Plugging these values into the formula, we get:
Area = (1/2) * |(0(1 - 3) + (-1)(3 - 0) + 3(0 - 1))|
Area = (1/2) * |(-0 + 3 - 3)|
Area = (1/2) * |0|
Area = 0
The area of the triangle is 0. This means that the three points are collinear, and there is no triangle formed by these points.
```
Our Corrected Answer:
```
Area is 3
```
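The corrected result can be checked with the same shoelace formula the reference answer quotes; a quick sketch:

```python
def triangle_area(p1, p2, p3):
    """Area of a triangle from its vertex coordinates (shoelace formula)."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

print(triangle_area((0, 0), (-1, 1), (3, 3)))  # → 3.0
```

The points are not collinear, so the area is nonzero: the GPT-4 reference answer dropped a term when substituting into the formula.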
Overall, we find nearly 25% of examples in the reasoning, math, and coding categories had incorrect reference solutions or had flawed premises.
# Physics GRE
We release four processed Physics GRE exams, a common graduate school entrance exam for physics students.
* `physics_gre_scored.jsonl`: Exam GR8677 which we report results on.
* `physics_gre.jsonl`: Three additional Physics GRE exams (GR9277, GR9677, and GR0177).
Each entry has the following fields:
* `input`: The question
* `target_scores`: The correct answer choice
* `has_image`: Whether or not the question has an image. We include only questions without an image in our scoring.
| Model | Percentile |
| ----------------------| ---------: |
| Inflection-2.5 maj@8 | 85 |
| Inflection-2.5 maj@32 | 95 |
| GPT-4 maj@8 | 97 |
## Exam Scoring Details
For the Physics GRE, each correct answer is worth 1 point and each incorrect answer results in a -0.25 reduction.
To compute score, we make the following assumption:
```
Raw_Score = Percentage_Correct - 0.25 * (1 - Percentage_Correct)
```
where `Percentage_Correct` is computed purely on questions without images. For simplicity, we do not use heuristics to allow the model not to answer.
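Under that assumption, the score computation is a one-liner (note that the percentile table works in points out of 100, so a fractional result has to be scaled accordingly; that scaling convention is our assumption, not stated in the source):

```python
def raw_score(percentage_correct):
    """Raw GRE score from the fraction of questions answered correctly.

    Correct answers earn 1 point; incorrect answers cost 0.25 points,
    per the assumption stated above.
    """
    return percentage_correct - 0.25 * (1 - percentage_correct)

# e.g. 80% correct on the image-free questions gives 0.75,
# i.e. 75 on the 0-100 scale of the percentile table.
```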
| Raw Score | Percentile |
| -----------: | ---------: |
| 81 – 100 | 98 |
| 77 – 80 | 97 |
| 75 – 76 | 96 |
| 72 – 74 | 95 |
| 71 | 94 |
| 69 – 70 | 93 |
| 67 – 68 | 92 |
| 65 – 66 | 91 |
| 64 | 90 |
| 63 | 89 |
| 61 – 62 | 87 |
| 60 | 86 |
| 59 | 85 |
| 57 – 58 | 84 |
| 56 | 82 |
| 55 | 80 |
| 53 – 54 | 78 |
| 52 | 77 |
| 51 | 75 |
| 49 – 50 | 72 |
| 48 | 70 |
| 47 | 69 |
| 45 – 46 | 66 |
| 44 | 64 |
| 43 | 62 |
| 41 – 42 | 59 |
| 40 | 57 |
| 39 | 54 |
| 37 – 38 | 52 |
| 36 | 48 |
| 35 | 46 |
| 33 – 34 | 43 |
| 32 | 41 |
| 30 – 31 | 38 |
| 29 | 35 |
| 28 | 32 |
| 26 – 27 | 30 |
| 25 | 27 |
| 24 | 25 |
| 22 – 23 | 22 |
| 21 | 20 |
| 20 | 18 |
| 18 – 19 | 16 |
| 17 | 14 |
| 16 | 12 |
| 14 – 15 | 10 |
| 13 | 9 |
| 12 | 8 |
| 10 – 11 | 6 |
| 9 | 5 |
| 8 | 4 |
| 6 – 7 | 3 |
| 5 | 2 |
| 1 – 4 | 1 |
| 0 | 0 | |
Nadav/CaribbeanScans | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': evaluation
'1': train
splits:
- name: train
num_bytes: 152948913099.784
num_examples: 1675172
- name: test
num_bytes: 9056919525.81
num_examples: 87721
download_size: 57344797328
dataset_size: 162005832625.594
---
# Dataset Card for "CaribbeanScans"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CVasNLPExperiments/VQAv2_minival_validation_google_flan_t5_xxl_mode_Q_rices_ns_25994 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: prompt
sequence: string
- name: question
dtype: string
- name: true_label
sequence: string
- name: prediction
dtype: string
splits:
- name: fewshot_0_clip_tags_LAION_ViT_H_14_2B_with_openai_Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full_DETA_detections_deta_swin_large_o365_coco_classes_caption_all_patches_Salesforce_blip_image_captioning_large_clean_
num_bytes: 7009336
num_examples: 25994
download_size: 1921170
dataset_size: 7009336
---
# Dataset Card for "VQAv2_minival_validation_google_flan_t5_xxl_mode_Q_rices_ns_25994"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
OGOFML/test_embeddings_qa_esma | ---
license: unlicense
---
|
commaai/commaCarSegments | ---
pretty_name: "Comma Car Segments"
tags:
- comma
- openpilot
---
# Comma Car Segments
A public dataset of sanitized car segments for each car that [openpilot](https://github.com/commaai/openpilot) supports, containing only CAN data, pandaStates, and carParams. This allows you to perform validation across a large database when doing car ports and making changes for particular platforms. |
Mitsuki-Sakamoto/alpaca_farm-deberta-re-pref-64-_fil_self_1.4b_bo2_100_kl_0.1_prm_410m_thr_1.0_seed_2 | ---
dataset_info:
config_name: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: preference
dtype: int64
- name: output_1
dtype: string
- name: output_2
dtype: string
- name: reward_model_prompt_format
dtype: string
- name: gen_prompt_format
dtype: string
- name: gen_kwargs
struct:
- name: do_sample
dtype: bool
- name: max_new_tokens
dtype: int64
- name: pad_token_id
dtype: int64
- name: top_k
dtype: int64
- name: top_p
dtype: float64
- name: reward_1
dtype: float64
- name: reward_2
dtype: float64
- name: n_samples
dtype: int64
- name: reject_select
dtype: string
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
- name: index
dtype: int64
- name: filtered_epoch
dtype: int64
- name: gen_reward
dtype: float64
- name: gen_response
dtype: string
splits:
- name: epoch_0
num_bytes: 43590759
num_examples: 18929
- name: epoch_1
num_bytes: 43801606
num_examples: 18929
- name: epoch_2
num_bytes: 43769566
num_examples: 18929
- name: epoch_3
num_bytes: 43712983
num_examples: 18929
- name: epoch_4
num_bytes: 43665131
num_examples: 18929
- name: epoch_5
num_bytes: 43651095
num_examples: 18929
- name: epoch_6
num_bytes: 43628456
num_examples: 18929
- name: epoch_7
num_bytes: 43622112
num_examples: 18929
- name: epoch_8
num_bytes: 43619791
num_examples: 18929
- name: epoch_9
num_bytes: 43613447
num_examples: 18929
download_size: 232039066
dataset_size: 436674946
configs:
- config_name: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500
data_files:
- split: epoch_0
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_0-*
- split: epoch_1
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_1-*
- split: epoch_2
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_2-*
- split: epoch_3
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_3-*
- split: epoch_4
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_4-*
- split: epoch_5
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_5-*
- split: epoch_6
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_6-*
- split: epoch_7
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_7-*
- split: epoch_8
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_8-*
- split: epoch_9
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_9-*
---
|
louisbrulenaudet/code-organisation-judiciaire | ---
license: apache-2.0
language:
- fr
multilinguality:
- monolingual
tags:
- finetuning
- legal
- french law
- droit français
- Code de l'organisation judiciaire
source_datasets:
- original
pretty_name: Code de l'organisation judiciaire
task_categories:
- text-generation
- table-question-answering
- summarization
- text-retrieval
- question-answering
- text-classification
size_categories:
- 1K<n<10K
---
# Code de l'organisation judiciaire, non-instruct (2024-04-15)
This project focuses on fine-tuning pre-trained language models to create efficient and accurate models for legal practice.
Fine-tuning is the process of adapting a pre-trained model to perform specific tasks or cater to particular domains. It involves adjusting the model's parameters through a further round of training on task-specific or domain-specific data. While conventional fine-tuning strategies involve supervised learning with labeled data, instruction-based fine-tuning introduces a more structured and interpretable approach.
Instruction-based fine-tuning leverages the power of human-provided instructions to guide the model's behavior. These instructions can be in the form of text prompts, prompts with explicit task descriptions, or a combination of both. This approach allows for a more controlled and context-aware interaction with the LLM, making it adaptable to a multitude of specialized tasks.
Instruction-based fine-tuning significantly enhances the performance of LLMs in the following ways:
- Task-Specific Adaptation: LLMs, when fine-tuned with specific instructions, exhibit remarkable adaptability to diverse tasks. They can switch seamlessly between translation, summarization, and question-answering, guided by the provided instructions.
- Reduced Ambiguity: Traditional LLMs might generate ambiguous or contextually inappropriate responses. Instruction-based fine-tuning allows for a clearer and more context-aware generation, reducing the likelihood of nonsensical outputs.
- Efficient Knowledge Transfer: Instructions can encapsulate domain-specific knowledge, enabling LLMs to benefit from expert guidance. This knowledge transfer is particularly valuable in fields like tax practice, law, medicine, and more.
- Interpretability: Instruction-based fine-tuning also makes LLM behavior more interpretable. Since the instructions are human-readable, it becomes easier to understand and control model outputs.
- Adaptive Behavior: LLMs, post instruction-based fine-tuning, exhibit adaptive behavior that is responsive to both explicit task descriptions and implicit cues within the provided text.
## Concurrent reading of the LegalKit
To use all the legal data published on LegalKit, you can use this code snippet:
```python
# -*- coding: utf-8 -*-
import concurrent.futures
import logging
import os

import datasets
from tqdm.notebook import tqdm
def dataset_loader(
name:str,
streaming:bool=True
) -> datasets.Dataset:
"""
Helper function to load a single dataset in parallel.
Parameters
----------
name : str
Name of the dataset to be loaded.
streaming : bool, optional
Determines if datasets are streamed. Default is True.
Returns
-------
dataset : datasets.Dataset
Loaded dataset object.
Raises
------
Exception
If an error occurs during dataset loading.
"""
try:
return datasets.load_dataset(
name,
split="train",
streaming=streaming
)
except Exception as exc:
logging.error(f"Error loading dataset {name}: {exc}")
return None
def load_datasets(
req:list,
streaming:bool=True
) -> list:
"""
Downloads datasets specified in a list and creates a list of loaded datasets.
Parameters
----------
req : list
A list containing the names of datasets to be downloaded.
streaming : bool, optional
Determines if datasets are streamed. Default is True.
Returns
-------
datasets_list : list
A list containing loaded datasets as per the requested names provided in 'req'.
Raises
------
Exception
If an error occurs during dataset loading or processing.
Examples
--------
>>> datasets = load_datasets(["dataset1", "dataset2"], streaming=False)
"""
datasets_list = []
with concurrent.futures.ThreadPoolExecutor() as executor:
future_to_dataset = {executor.submit(dataset_loader, name): name for name in req}
for future in tqdm(concurrent.futures.as_completed(future_to_dataset), total=len(req)):
name = future_to_dataset[future]
try:
dataset = future.result()
if dataset:
datasets_list.append(dataset)
except Exception as exc:
logging.error(f"Error processing dataset {name}: {exc}")
return datasets_list
req = [
"louisbrulenaudet/code-artisanat",
"louisbrulenaudet/code-action-sociale-familles",
# ...
]
datasets_list = load_datasets(
req=req,
streaming=True
)
dataset = datasets.concatenate_datasets(
datasets_list
)
```
## Dataset generation
This JSON file is a list of dictionaries; each dictionary contains the following fields:
- `instruction`: `string`, presenting the instruction linked to the element.
- `input`: `string`, signifying the input details for the element.
- `output`: `string`, indicating the output information for the element.
- `start`: `string`, the date of entry into force of the article.
- `expiration`: `string`, the date of expiration of the article.
- `num`: `string`, the id of the article.
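Because each entry carries its `start` and `expiration` dates, it is straightforward to keep only the articles in force on a given date. A minimal sketch, assuming the dates are ISO-formatted `YYYY-MM-DD` strings (which compare correctly as plain strings):

```python
def in_force(article, on_date):
    """True if an article is in force on `on_date` (ISO date strings)."""
    started = article["start"] <= on_date
    not_expired = article["expiration"] > on_date
    return started and not_expired

# current = [a for a in dataset if in_force(a, "2024-04-15")]
```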
We used the following list of instructions for generating the dataset:
```python
instructions = [
"Compose l'intégralité de l'article sous forme écrite.",
"Écris la totalité du contenu de l'article.",
"Formule la totalité du texte présent dans l'article.",
"Produis l'intégralité de l'article en écriture.",
"Développe l'article dans son ensemble par écrit.",
"Génère l'ensemble du texte contenu dans l'article.",
"Formule le contenu intégral de l'article en entier.",
"Rédige la totalité du texte de l'article en entier.",
"Compose l'intégralité du contenu textuel de l'article.",
"Rédige l'ensemble du texte qui constitue l'article.",
"Formule l'article entier dans son contenu écrit.",
"Composez l'intégralité de l'article sous forme écrite.",
"Écrivez la totalité du contenu de l'article.",
"Formulez la totalité du texte présent dans l'article.",
"Développez l'article dans son ensemble par écrit.",
"Générez l'ensemble du texte contenu dans l'article.",
"Formulez le contenu intégral de l'article en entier.",
"Rédigez la totalité du texte de l'article en entier.",
"Composez l'intégralité du contenu textuel de l'article.",
"Écrivez l'article dans son intégralité en termes de texte.",
"Rédigez l'ensemble du texte qui constitue l'article.",
"Formulez l'article entier dans son contenu écrit.",
"Composer l'intégralité de l'article sous forme écrite.",
"Écrire la totalité du contenu de l'article.",
"Formuler la totalité du texte présent dans l'article.",
"Produire l'intégralité de l'article en écriture.",
"Développer l'article dans son ensemble par écrit.",
"Générer l'ensemble du texte contenu dans l'article.",
"Formuler le contenu intégral de l'article en entier.",
"Rédiger la totalité du texte de l'article en entier.",
"Composer l'intégralité du contenu textuel de l'article.",
"Rédiger l'ensemble du texte qui constitue l'article.",
"Formuler l'article entier dans son contenu écrit.",
"Quelles sont les dispositions de l'article ?",
"Quelles dispositions sont incluses dans l'article ?",
"Quelles sont les dispositions énoncées dans l'article ?",
"Quel est le texte intégral de l'article ?",
"Quelle est la lettre de l'article ?"
]
```
## Feedback
If you have any feedback, please reach out at [louisbrulenaudet@icloud.com](mailto:louisbrulenaudet@icloud.com). |
cstorm125/llama_train_scb_mt_enth_2020 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 512953705.5828189
num_examples: 1000750
- name: test
num_bytes: 513594.4171810987
num_examples: 1002
download_size: 256567201
dataset_size: 513467300.0
---
# Dataset Card for "llama_train_scb_mt_enth_2020"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
deven367/babylm-100M-qed | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: valid
path: data/valid-*
- split: test
path: data/test-*
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 59840684
num_examples: 959659
- name: valid
num_bytes: 5678320
num_examples: 94976
- name: test
num_bytes: 7027994
num_examples: 114964
download_size: 44263522
dataset_size: 72546998
---
# Dataset Card for "babylm-100M-qed"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/tamaki_ui_puellamagimadokamagicasidestorymagiarecord | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Tamaki Ui
This is the dataset of Tamaki Ui, containing 59 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 59 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 139 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 59 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 59 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 59 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 59 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 59 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 139 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 139 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 139 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
CorticalStack/tak-stack-dpo | ---
language:
- en
license: apache-2.0
task_categories:
- text-generation
pretty_name: tak-stack-dpo
dataset_info:
features:
- name: source
dtype: string
- name: instruction
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
num_examples: 10856
dataset_size: 14401432 bytes
tags:
- synthetic
- dpo
---
<img src="neurotic-crown-clown-tak-stack.png" alt="Neurotic crown clown tak stack logo" width="800" style="margin-left:'auto' margin-right:'auto' display:'block'"/>
# tak-stack-dpo 🧠
A DPO alignment dataset for fine-tuning open-source LLMs, taking sample preference pairs from a variety of datasets for diversity.
Prepared in the standard `instruction`, `chosen`, and `rejected` format, with a `source` feature indicating the dataset from which each row was extracted.
Source datasets:
- [argilla/distilabel-math-preference-dpo](https://huggingface.co/argilla/distilabel-math-preference-dpo)
- [jondurbin/truthy-dpo-v0.1](https://huggingface.co/jondurbin/truthy-dpo-v0.1)
- [argilla/distilabel-intel-orca-dpo-pairs](https://huggingface.co/argilla/distilabel-intel-orca-dpo-pairs)
- [argilla/OpenHermes2.5-dpo-binarized-alpha](https://huggingface.co/argilla/OpenHermes2.5-dpo-binarized-alpha) |
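Since every row carries a `source` feature, it is easy to check how the preference pairs are distributed across the upstream datasets; a minimal sketch (the commented-out `load_dataset` call shows the intended usage):

```python
from collections import Counter

def count_by_source(rows):
    """Count preference pairs per upstream dataset."""
    return Counter(row["source"] for row in rows)

# from datasets import load_dataset
# ds = load_dataset("CorticalStack/tak-stack-dpo", split="train")
# print(count_by_source(ds))
```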
bigscience-data/roots_indic-ur_ted_talks_iwslt | ---
language: ur
license: cc-by-nc-nd-4.0
extra_gated_prompt: 'By accessing this dataset, you agree to abide by the BigScience
Ethical Charter. The charter can be found at:
https://hf.co/spaces/bigscience/ethical-charter'
extra_gated_fields:
I have read and agree to abide by the BigScience Ethical Charter: checkbox
---
ROOTS Subset: roots_indic-ur_ted_talks_iwslt
# WIT Ted Talks
- Dataset uid: `ted_talks_iwslt`
### Description
The Web Inventory Talk is a collection of the original TED talks and their translated versions. The translations are available in 109+ languages, though the distribution is not uniform.
### Homepage
https://github.com/huggingface/datasets/blob/master/datasets/ted_talks_iwslt/README.md
### Licensing
- open license
- cc-by-nc-4.0: Creative Commons Attribution Non Commercial 4.0 International
TED makes its collection of video recordings and transcripts of talks available under the Creative Commons BY-NC-ND license. WIT3 acknowledges the authorship of TED talks (BY condition) and does not redistribute transcripts for commercial purposes (NC). As regards the integrity of the work (ND), WIT3 only changes the format of the container, while preserving the original contents. WIT3 aims to support research on human language processing as well as the diffusion of TED Talks!
### Speaker Locations
- Southern Europe
- Italy
### Sizes
- 0.0305 % of total
- 0.0736 % of ar
- 0.2002 % of pt
- 0.0128 % of zh
- 0.2236 % of vi
- 0.0330 % of fr
- 0.0545 % of es
- 0.0122 % of en
- 0.3704 % of id
- 0.0373 % of indic-hi
- 0.0330 % of indic-ta
- 0.1393 % of indic-mr
- 0.0305 % of ca
- 0.1179 % of indic-ur
- 0.0147 % of indic-bn
- 0.0240 % of indic-ml
- 0.0244 % of indic-te
- 0.0503 % of indic-gu
- 0.0211 % of indic-kn
- 0.0274 % of eu
- 0.0023 % of indic-as
- 0.0001 % of indic-pa
### BigScience processing steps
#### Filters applied to: ar
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: pt
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: zh
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_1024
#### Filters applied to: vi
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: fr
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_1024
#### Filters applied to: es
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_1024
#### Filters applied to: en
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_1024
#### Filters applied to: id
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-hi
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-ta
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-mr
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: ca
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_1024
#### Filters applied to: indic-ur
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-bn
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-ml
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-te
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-gu
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-kn
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: eu
- dedup_document
- filter_remove_empty_docs
#### Filters applied to: indic-as
- dedup_document
- filter_remove_empty_docs
#### Filters applied to: indic-pa
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
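As with other ROOTS subsets, this data should be loadable with the 🤗 `datasets` library once the gating terms above have been accepted. A minimal sketch (the single `train` split and the Hub workflow are assumptions based on the standard ROOTS release pattern, not stated in this card):

```python
REPO_ID = "bigscience-data/roots_indic-ur_ted_talks_iwslt"

def load_subset(split: str = "train"):
    """Load this gated ROOTS subset from the Hugging Face Hub.

    Assumes the usual single "train" split. Requires the `datasets`
    package, prior acceptance of the BigScience Ethical Charter on the
    dataset page, and Hub authentication (e.g. `huggingface-cli login`).
    """
    # Imported lazily so the sketch can be read without `datasets` installed.
    from datasets import load_dataset
    return load_dataset(REPO_ID, split=split)

# Example usage (needs network access and authentication):
# ds = load_subset()
# print(ds[0])  # inspect one document; field names depend on the release
```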
|
thjeon/Customkor | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 7786
num_examples: 32
download_size: 4172
dataset_size: 7786
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
healthcorum/autotrain-data-reformatted_test | ---
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: autotrain_text
dtype: string
- name: autotrain_label
dtype: string
splits:
- name: train
num_bytes: 9500675
num_examples: 7998
- name: validation
num_bytes: 2375962
num_examples: 2000
download_size: 4169235
dataset_size: 11876637
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
# Dataset Card for "autotrain-data-reformatted_test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jeevan-23/diff_formats_split_ds | ---
dataset_info:
features:
- name: image
dtype: image
- name: id
dtype: string
- name: query
struct:
- name: question
dtype: string
- name: answers
sequence: string
- name: words
sequence: string
- name: bounding_boxes
sequence:
sequence: int64
splits:
- name: train
num_bytes: 1046177127.1854271
num_examples: 1100
- name: test
num_bytes: 467926343.91457283
num_examples: 492
download_size: 1510652483
dataset_size: 1514103471.1
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
bskang/CVPR2023_title_abstract_intro | ---
dataset_info:
features:
- name: title
dtype: string
- name: abstract
dtype: string
- name: introduction
dtype: string
splits:
- name: train
num_bytes: 17145949
num_examples: 2335
download_size: 9340289
dataset_size: 17145949
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "CVPR2023_title_abstract_intro"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
damerajee/hindi-dpo | ---
license: mit
---
|
fugueweb/foc-webscrape | ---
license: mit
---
|
open-llm-leaderboard/details_allknowingroger__PercivalMelodias-7B-slerp | ---
pretty_name: Evaluation run of allknowingroger/PercivalMelodias-7B-slerp
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [allknowingroger/PercivalMelodias-7B-slerp](https://huggingface.co/allknowingroger/PercivalMelodias-7B-slerp)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_allknowingroger__PercivalMelodias-7B-slerp\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-10T22:03:47.023633](https://huggingface.co/datasets/open-llm-leaderboard/details_allknowingroger__PercivalMelodias-7B-slerp/blob/main/results_2024-04-10T22-03-47.023633.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6510593961453048,\n\
\ \"acc_stderr\": 0.0320723947452529,\n \"acc_norm\": 0.6499611749770466,\n\
\ \"acc_norm_stderr\": 0.032749688186930324,\n \"mc1\": 0.6328029375764994,\n\
\ \"mc1_stderr\": 0.016874805001453184,\n \"mc2\": 0.7796161514683636,\n\
\ \"mc2_stderr\": 0.013706059726721979\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7107508532423208,\n \"acc_stderr\": 0.013250012579393441,\n\
\ \"acc_norm\": 0.726962457337884,\n \"acc_norm_stderr\": 0.013019332762635753\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7167894841665007,\n\
\ \"acc_stderr\": 0.0044963697421321015,\n \"acc_norm\": 0.8910575582553276,\n\
\ \"acc_norm_stderr\": 0.003109302300176206\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n\
\ \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443866,\n\
\ \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443866\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\
: 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n\
\ \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n\
\ \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n\
\ \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4074074074074074,\n \"acc_stderr\": 0.02530590624159063,\n \"\
acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.02530590624159063\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n\
\ \"acc_stderr\": 0.023540799358723295,\n \"acc_norm\": 0.7806451612903226,\n\
\ \"acc_norm_stderr\": 0.023540799358723295\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.03287666758603491,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.03287666758603491\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.797979797979798,\n \"acc_stderr\": 0.02860620428922987,\n \"acc_norm\"\
: 0.797979797979798,\n \"acc_norm_stderr\": 0.02860620428922987\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328973,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328973\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n\
\ \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \
\ \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.03958027231121569,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.03958027231121569\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8440366972477065,\n \"acc_stderr\": 0.01555580271359017,\n \"\
acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.01555580271359017\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.8382352941176471,\n \"acc_stderr\": 0.02584501798692692,\n\
\ \"acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.02584501798692692\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \
\ \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8212005108556832,\n\
\ \"acc_stderr\": 0.013702643715368983,\n \"acc_norm\": 0.8212005108556832,\n\
\ \"acc_norm_stderr\": 0.013702643715368983\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.02394851290546836,\n\
\ \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.02394851290546836\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4312849162011173,\n\
\ \"acc_stderr\": 0.016563829399047703,\n \"acc_norm\": 0.4312849162011173,\n\
\ \"acc_norm_stderr\": 0.016563829399047703\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.025646863097137897,\n\
\ \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.025646863097137897\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n\
\ \"acc_stderr\": 0.02600330111788514,\n \"acc_norm\": 0.7009646302250804,\n\
\ \"acc_norm_stderr\": 0.02600330111788514\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.024383665531035457,\n\
\ \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.024383665531035457\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4745762711864407,\n\
\ \"acc_stderr\": 0.012753716929101006,\n \"acc_norm\": 0.4745762711864407,\n\
\ \"acc_norm_stderr\": 0.012753716929101006\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.02824568739146292,\n\
\ \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.02824568739146292\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6764705882352942,\n \"acc_stderr\": 0.018926082916083383,\n \
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.018926082916083383\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784596,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784596\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n\
\ \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n\
\ \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6328029375764994,\n\
\ \"mc1_stderr\": 0.016874805001453184,\n \"mc2\": 0.7796161514683636,\n\
\ \"mc2_stderr\": 0.013706059726721979\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8500394632991318,\n \"acc_stderr\": 0.010034394804580809\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7073540561031084,\n \
\ \"acc_stderr\": 0.0125323343682429\n }\n}\n```"
repo_url: https://huggingface.co/allknowingroger/PercivalMelodias-7B-slerp
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_10T22_03_47.023633
path:
- '**/details_harness|arc:challenge|25_2024-04-10T22-03-47.023633.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-10T22-03-47.023633.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_10T22_03_47.023633
path:
- '**/details_harness|gsm8k|5_2024-04-10T22-03-47.023633.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-10T22-03-47.023633.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_10T22_03_47.023633
path:
- '**/details_harness|hellaswag|10_2024-04-10T22-03-47.023633.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-10T22-03-47.023633.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_10T22_03_47.023633
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-10T22-03-47.023633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-10T22-03-47.023633.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-10T22-03-47.023633.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_10T22_03_47.023633
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-10T22-03-47.023633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-10T22-03-47.023633.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_10T22_03_47.023633
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-10T22-03-47.023633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-10T22-03-47.023633.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_10T22_03_47.023633
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-10T22-03-47.023633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-10T22-03-47.023633.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_10T22_03_47.023633
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-10T22-03-47.023633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-10T22-03-47.023633.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_10T22_03_47.023633
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-10T22-03-47.023633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-10T22-03-47.023633.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_10T22_03_47.023633
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-10T22-03-47.023633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-10T22-03-47.023633.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_10T22_03_47.023633
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-10T22-03-47.023633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-10T22-03-47.023633.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_10T22_03_47.023633
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-10T22-03-47.023633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-10T22-03-47.023633.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_10T22_03_47.023633
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-10T22-03-47.023633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-10T22-03-47.023633.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_10T22_03_47.023633
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-10T22-03-47.023633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-10T22-03-47.023633.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_10T22_03_47.023633
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-10T22-03-47.023633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-10T22-03-47.023633.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_10T22_03_47.023633
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-10T22-03-47.023633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-10T22-03-47.023633.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_10T22_03_47.023633
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-10T22-03-47.023633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-10T22-03-47.023633.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_10T22_03_47.023633
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-10T22-03-47.023633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-10T22-03-47.023633.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_10T22_03_47.023633
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-10T22-03-47.023633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-10T22-03-47.023633.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_10T22_03_47.023633
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-10T22-03-47.023633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-10T22-03-47.023633.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_10T22_03_47.023633
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-10T22-03-47.023633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-10T22-03-47.023633.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_10T22_03_47.023633
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-10T22-03-47.023633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-10T22-03-47.023633.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_10T22_03_47.023633
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-10T22-03-47.023633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-10T22-03-47.023633.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_10T22_03_47.023633
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-10T22-03-47.023633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-10T22-03-47.023633.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_10T22_03_47.023633
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-10T22-03-47.023633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-10T22-03-47.023633.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_10T22_03_47.023633
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-10T22-03-47.023633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-10T22-03-47.023633.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_10T22_03_47.023633
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-10T22-03-47.023633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-10T22-03-47.023633.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_10T22_03_47.023633
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-10T22-03-47.023633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-10T22-03-47.023633.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_10T22_03_47.023633
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-10T22-03-47.023633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-10T22-03-47.023633.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_10T22_03_47.023633
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-10T22-03-47.023633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-10T22-03-47.023633.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_10T22_03_47.023633
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-10T22-03-47.023633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-10T22-03-47.023633.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_10T22_03_47.023633
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-10T22-03-47.023633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-10T22-03-47.023633.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_10T22_03_47.023633
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-10T22-03-47.023633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-10T22-03-47.023633.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_10T22_03_47.023633
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-10T22-03-47.023633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-10T22-03-47.023633.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_10T22_03_47.023633
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-10T22-03-47.023633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-10T22-03-47.023633.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_10T22_03_47.023633
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-10T22-03-47.023633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-10T22-03-47.023633.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_10T22_03_47.023633
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-10T22-03-47.023633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-10T22-03-47.023633.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_10T22_03_47.023633
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-10T22-03-47.023633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-10T22-03-47.023633.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_10T22_03_47.023633
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-10T22-03-47.023633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-10T22-03-47.023633.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_10T22_03_47.023633
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-10T22-03-47.023633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-10T22-03-47.023633.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_10T22_03_47.023633
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-10T22-03-47.023633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-10T22-03-47.023633.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_10T22_03_47.023633
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-10T22-03-47.023633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-10T22-03-47.023633.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_10T22_03_47.023633
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-10T22-03-47.023633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-10T22-03-47.023633.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_10T22_03_47.023633
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-10T22-03-47.023633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-10T22-03-47.023633.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_10T22_03_47.023633
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-10T22-03-47.023633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-10T22-03-47.023633.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_10T22_03_47.023633
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-10T22-03-47.023633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-10T22-03-47.023633.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_10T22_03_47.023633
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-10T22-03-47.023633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-10T22-03-47.023633.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_10T22_03_47.023633
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-10T22-03-47.023633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-10T22-03-47.023633.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_10T22_03_47.023633
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-10T22-03-47.023633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-10T22-03-47.023633.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_10T22_03_47.023633
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-10T22-03-47.023633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-10T22-03-47.023633.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_10T22_03_47.023633
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-10T22-03-47.023633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-10T22-03-47.023633.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_10T22_03_47.023633
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-10T22-03-47.023633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-10T22-03-47.023633.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_10T22_03_47.023633
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-10T22-03-47.023633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-10T22-03-47.023633.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_10T22_03_47.023633
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-10T22-03-47.023633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-10T22-03-47.023633.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_10T22_03_47.023633
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-10T22-03-47.023633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-10T22-03-47.023633.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_10T22_03_47.023633
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-10T22-03-47.023633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-10T22-03-47.023633.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_10T22_03_47.023633
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-10T22-03-47.023633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-10T22-03-47.023633.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_10T22_03_47.023633
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-10T22-03-47.023633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-10T22-03-47.023633.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_10T22_03_47.023633
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-10T22-03-47.023633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-10T22-03-47.023633.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_10T22_03_47.023633
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-10T22-03-47.023633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-10T22-03-47.023633.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_10T22_03_47.023633
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-10T22-03-47.023633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-10T22-03-47.023633.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_10T22_03_47.023633
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-10T22-03-47.023633.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-10T22-03-47.023633.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_10T22_03_47.023633
path:
- '**/details_harness|winogrande|5_2024-04-10T22-03-47.023633.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-10T22-03-47.023633.parquet'
- config_name: results
data_files:
- split: 2024_04_10T22_03_47.023633
path:
- results_2024-04-10T22-03-47.023633.parquet
- split: latest
path:
- results_2024-04-10T22-03-47.023633.parquet
---
# Dataset Card for Evaluation run of allknowingroger/PercivalMelodias-7B-slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [allknowingroger/PercivalMelodias-7B-slerp](https://huggingface.co/allknowingroger/PercivalMelodias-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_allknowingroger__PercivalMelodias-7B-slerp",
"harness_winogrande_5",
split="train")
```
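The timestamped splits follow a naming convention that can be read off this card: dashes and colons in the run timestamp become underscores, while the fractional-seconds dot is kept. A minimal sketch of that conversion (an observed pattern in this card, not a documented `datasets` API):

```python
# Derive a split name from a run timestamp, following the naming convention
# observed in this card (dashes and colons become underscores; the
# fractional-seconds dot is kept unchanged).
def timestamp_to_split(ts: str) -> str:
    return ts.replace("-", "_").replace(":", "_")

print(timestamp_to_split("2024-04-10T22:03:47.023633"))
# → 2024_04_10T22_03_47.023633
```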
## Latest results
These are the [latest results from run 2024-04-10T22:03:47.023633](https://huggingface.co/datasets/open-llm-leaderboard/details_allknowingroger__PercivalMelodias-7B-slerp/blob/main/results_2024-04-10T22-03-47.023633.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task's results in its "latest" split):
```python
{
"all": {
"acc": 0.6510593961453048,
"acc_stderr": 0.0320723947452529,
"acc_norm": 0.6499611749770466,
"acc_norm_stderr": 0.032749688186930324,
"mc1": 0.6328029375764994,
"mc1_stderr": 0.016874805001453184,
"mc2": 0.7796161514683636,
"mc2_stderr": 0.013706059726721979
},
"harness|arc:challenge|25": {
"acc": 0.7107508532423208,
"acc_stderr": 0.013250012579393441,
"acc_norm": 0.726962457337884,
"acc_norm_stderr": 0.013019332762635753
},
"harness|hellaswag|10": {
"acc": 0.7167894841665007,
"acc_stderr": 0.0044963697421321015,
"acc_norm": 0.8910575582553276,
"acc_norm_stderr": 0.003109302300176206
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.02825420034443866,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.02825420034443866
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.03614665424180826,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.03614665424180826
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.02530590624159063,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.02530590624159063
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723295,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723295
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.03287666758603491,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.03287666758603491
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.02860620428922987,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.02860620428922987
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328973,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328973
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.023991500500313036,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.023991500500313036
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.02840653309060846,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.02840653309060846
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.03958027231121569,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.03958027231121569
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8440366972477065,
"acc_stderr": 0.01555580271359017,
"acc_norm": 0.8440366972477065,
"acc_norm_stderr": 0.01555580271359017
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5,
"acc_stderr": 0.034099716973523674,
"acc_norm": 0.5,
"acc_norm_stderr": 0.034099716973523674
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.02584501798692692,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.02584501798692692
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.02553010046023349,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.02553010046023349
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624714,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228732,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228732
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8212005108556832,
"acc_stderr": 0.013702643715368983,
"acc_norm": 0.8212005108556832,
"acc_norm_stderr": 0.013702643715368983
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.02394851290546836,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.02394851290546836
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4312849162011173,
"acc_stderr": 0.016563829399047703,
"acc_norm": 0.4312849162011173,
"acc_norm_stderr": 0.016563829399047703
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.025646863097137897,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.025646863097137897
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.02600330111788514,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.02600330111788514
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.024383665531035457,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.024383665531035457
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4745762711864407,
"acc_stderr": 0.012753716929101006,
"acc_norm": 0.4745762711864407,
"acc_norm_stderr": 0.012753716929101006
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.02824568739146292,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.02824568739146292
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.018926082916083383,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.018926082916083383
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784596,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784596
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6328029375764994,
"mc1_stderr": 0.016874805001453184,
"mc2": 0.7796161514683636,
"mc2_stderr": 0.013706059726721979
},
"harness|winogrande|5": {
"acc": 0.8500394632991318,
"acc_stderr": 0.010034394804580809
},
"harness|gsm8k|5": {
"acc": 0.7073540561031084,
"acc_stderr": 0.0125323343682429
}
}
```
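Once loaded, the results JSON is a flat mapping from task name to metric dict, so downstream summaries reduce to simple dictionary passes. The sketch below computes a macro-average over a hand-copied excerpt of the values above; the `excerpt` dict and the `macro_avg` helper are illustrative, not part of any dataset API.

```python
# Minimal sketch: macro-average a metric over task entries. The dict below
# is a hand-copied subset of this card's results JSON, not the full file.
excerpt = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc_norm": 0.3},
    "harness|hendrycksTest-anatomy|5": {"acc_norm": 0.6370370370370371},
    "harness|hendrycksTest-astronomy|5": {"acc_norm": 0.7105263157894737},
}

def macro_avg(results: dict, metric: str = "acc_norm") -> float:
    """Average `metric` across all task entries that report it."""
    values = [v[metric] for v in results.values() if metric in v]
    return sum(values) / len(values)

print(round(macro_avg(excerpt), 4))
# → 0.5492
```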
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
FanChen0116/19100_chat_8x_slot_pvi | ---
dataset_info:
features:
- name: id
dtype: int64
- name: tokens
sequence: string
- name: labels
sequence:
class_label:
names:
'0': O
'1': I-time
'2': B-date
'3': B-last_name
'4': B-people
'5': I-date
'6': I-people
'7': I-last_name
'8': I-first_name
'9': B-first_name
'10': B-time
- name: request_slot
sequence: string
splits:
- name: train
num_bytes: 77646
num_examples: 493
- name: validation
num_bytes: 4887
num_examples: 32
- name: test
num_bytes: 570513
num_examples: 3731
download_size: 18080
dataset_size: 653046
---
# Dataset Card for "19100_chat_8x_slot_pvi"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
waliiid/your_dataset_www | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: Input
dtype: string
- name: Name
dtype: string
- name: Description
dtype: string
- name: Temperature
dtype: string
splits:
- name: train
num_bytes: 3720
num_examples: 15
download_size: 4656
dataset_size: 3720
---
# Dataset Card for "your_dataset_www"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kalcho100/flippy_sampled_dataset | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 92783189.1920878
num_examples: 67723
- name: test
num_bytes: 23196139.80791221
num_examples: 16931
download_size: 62515456
dataset_size: 115979329.0
---
# Dataset Card for "flippy_sampled_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nz/lichess_data_with_move_numbers | ---
dataset_info:
features:
- name: result
dtype: string
- name: white_elo
dtype: string
- name: black_elo
dtype: string
- name: termination
dtype: string
- name: moves
dtype: string
- name: source
dtype: string
- name: id
dtype: string
splits:
- name: train
num_bytes: 325591474621
num_examples: 685036846
download_size: 183644833120
dataset_size: 325591474621
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
akbarali506/legal-data | ---
license: unknown
---
|
Teklia/CASIA-HWDB2-line | ---
license: mit
language:
- zh
task_categories:
- image-to-text
pretty_name: CASIA-HWDB2-line
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_examples: 33401
- name: validation
num_examples: 8318
- name: test
num_examples: 10441
dataset_size: 52160
tags:
- atr
- htr
- ocr
- modern
- handwritten
---
# CASIA-HWDB2 - line level
## Table of Contents
- [CASIA-HWDB2 - line level](#casia-hwdb2-line-level)
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
## Dataset Description
- **Homepage:** [CASIA-HWDB2](http://www.nlpr.ia.ac.cn/databases/handwriting/Download.html)
- **Paper:** [Online and offline handwritten Chinese character recognition: Benchmarking on new databases](https://www.sciencedirect.com/science/article/abs/pii/S0031320312002919)
- **Point of Contact:** [TEKLIA](https://teklia.com)
## Dataset Summary
The offline Chinese handwriting database (CASIA-HWDB2) was built by the National Laboratory of Pattern Recognition (NLPR), Institute of Automation of Chinese Academy of Sciences (CASIA).
The handwritten samples were produced by 1,020 writers using an Anoto pen on paper, so that both online and offline data were obtained.
Note that all images are resized to a fixed height of 128 pixels.
### Languages
All the documents in the dataset are written in Chinese.
## Dataset Structure
### Data Instances
```
{
'image': <PIL.JpegImagePlugin.JpegImageFile image mode=RGB size=1244x128 at 0x1A800E8E190>,
'text': '2007年高校招生录取工作即将陆续展开,教育部有关负责人'
}
```
### Data Fields
- `image`: a `PIL.Image.Image` object containing the image. Note that accessing the image column (e.g. `dataset[0]["image"]`) automatically decodes the image file, and decoding a large number of image files can take a significant amount of time. It is therefore important to query the sample index before the `"image"` column: `dataset[0]["image"]` should always be preferred over `dataset["image"][0]`.
- `text`: the label transcription of the image. |
AdapterOcean/code_instructions_standardized_cluster_19 | ---
dataset_info:
features:
- name: text
dtype: string
- name: conversation_id
dtype: int64
- name: embedding
sequence: float64
- name: cluster
dtype: int64
splits:
- name: train
num_bytes: 52414584
num_examples: 5187
download_size: 14907077
dataset_size: 52414584
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "code_instructions_standardized_cluster_19"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Hungarians/Samples | ---
license: apache-2.0
---
|
open-llm-leaderboard/details_Cheng98__llama-39m | ---
pretty_name: Evaluation run of Cheng98/llama-39m
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Cheng98/llama-39m](https://huggingface.co/Cheng98/llama-39m) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Cheng98__llama-39m\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-21T12:54:56.361777](https://huggingface.co/datasets/open-llm-leaderboard/details_Cheng98__llama-39m/blob/main/results_2024-03-21T12-54-56.361777.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.24266373172846512,\n\
\ \"acc_stderr\": 0.03036219204331094,\n \"acc_norm\": 0.24328328699147694,\n\
\ \"acc_norm_stderr\": 0.03116947755557392,\n \"mc1\": 0.2533659730722154,\n\
\ \"mc1_stderr\": 0.015225899340826844,\n \"mc2\": 0.47187304526757556,\n\
\ \"mc2_stderr\": 0.01576900200671721\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.19283276450511946,\n \"acc_stderr\": 0.011529055465663334,\n\
\ \"acc_norm\": 0.24061433447098976,\n \"acc_norm_stderr\": 0.012491468532390568\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2566221868153754,\n\
\ \"acc_stderr\": 0.00435876459640103,\n \"acc_norm\": 0.2557259510057757,\n\
\ \"acc_norm_stderr\": 0.00435376873064457\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.037857144650666544,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.037857144650666544\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.20394736842105263,\n \"acc_stderr\": 0.03279000406310052,\n\
\ \"acc_norm\": 0.20394736842105263,\n \"acc_norm_stderr\": 0.03279000406310052\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.29,\n\
\ \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \
\ \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.22264150943396227,\n \"acc_stderr\": 0.0256042334708991,\n\
\ \"acc_norm\": 0.22264150943396227,\n \"acc_norm_stderr\": 0.0256042334708991\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.038760854559127644,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.038760854559127644\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.14,\n \"acc_stderr\": 0.03487350880197771,\n \
\ \"acc_norm\": 0.14,\n \"acc_norm_stderr\": 0.03487350880197771\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n\
\ \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.26011560693641617,\n\
\ \"acc_stderr\": 0.033450369167889904,\n \"acc_norm\": 0.26011560693641617,\n\
\ \"acc_norm_stderr\": 0.033450369167889904\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.042801058373643966,\n\
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.042801058373643966\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.25957446808510637,\n \"acc_stderr\": 0.02865917937429232,\n\
\ \"acc_norm\": 0.25957446808510637,\n \"acc_norm_stderr\": 0.02865917937429232\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.22807017543859648,\n\
\ \"acc_stderr\": 0.03947152782669415,\n \"acc_norm\": 0.22807017543859648,\n\
\ \"acc_norm_stderr\": 0.03947152782669415\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2482758620689655,\n \"acc_stderr\": 0.036001056927277716,\n\
\ \"acc_norm\": 0.2482758620689655,\n \"acc_norm_stderr\": 0.036001056927277716\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2671957671957672,\n \"acc_stderr\": 0.022789673145776557,\n \"\
acc_norm\": 0.2671957671957672,\n \"acc_norm_stderr\": 0.022789673145776557\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.23809523809523808,\n\
\ \"acc_stderr\": 0.03809523809523812,\n \"acc_norm\": 0.23809523809523808,\n\
\ \"acc_norm_stderr\": 0.03809523809523812\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.044084400227680794\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.2709677419354839,\n \"acc_stderr\": 0.02528441611490016,\n \"\
acc_norm\": 0.2709677419354839,\n \"acc_norm_stderr\": 0.02528441611490016\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.2561576354679803,\n \"acc_stderr\": 0.030712730070982592,\n \"\
acc_norm\": 0.2561576354679803,\n \"acc_norm_stderr\": 0.030712730070982592\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.21212121212121213,\n \"acc_stderr\": 0.03192271569548299,\n\
\ \"acc_norm\": 0.21212121212121213,\n \"acc_norm_stderr\": 0.03192271569548299\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.30808080808080807,\n \"acc_stderr\": 0.032894773300986155,\n \"\
acc_norm\": 0.30808080808080807,\n \"acc_norm_stderr\": 0.032894773300986155\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.32124352331606215,\n \"acc_stderr\": 0.033699508685490674,\n\
\ \"acc_norm\": 0.32124352331606215,\n \"acc_norm_stderr\": 0.033699508685490674\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2641025641025641,\n \"acc_stderr\": 0.022352193737453285,\n\
\ \"acc_norm\": 0.2641025641025641,\n \"acc_norm_stderr\": 0.022352193737453285\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.25925925925925924,\n \"acc_stderr\": 0.026719240783712163,\n \
\ \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.026719240783712163\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.31932773109243695,\n \"acc_stderr\": 0.030283995525884396,\n\
\ \"acc_norm\": 0.31932773109243695,\n \"acc_norm_stderr\": 0.030283995525884396\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.26490066225165565,\n \"acc_stderr\": 0.036030385453603826,\n \"\
acc_norm\": 0.26490066225165565,\n \"acc_norm_stderr\": 0.036030385453603826\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.22752293577981653,\n \"acc_stderr\": 0.017974463578776502,\n \"\
acc_norm\": 0.22752293577981653,\n \"acc_norm_stderr\": 0.017974463578776502\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.18055555555555555,\n \"acc_stderr\": 0.026232878971491656,\n \"\
acc_norm\": 0.18055555555555555,\n \"acc_norm_stderr\": 0.026232878971491656\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n\
\ \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.21524663677130046,\n\
\ \"acc_stderr\": 0.027584066602208263,\n \"acc_norm\": 0.21524663677130046,\n\
\ \"acc_norm_stderr\": 0.027584066602208263\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2231404958677686,\n \"acc_stderr\": 0.03800754475228733,\n \"\
acc_norm\": 0.2231404958677686,\n \"acc_norm_stderr\": 0.03800754475228733\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.21296296296296297,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.21296296296296297,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n\
\ \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3392857142857143,\n\
\ \"acc_stderr\": 0.04493949068613539,\n \"acc_norm\": 0.3392857142857143,\n\
\ \"acc_norm_stderr\": 0.04493949068613539\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.1941747572815534,\n \"acc_stderr\": 0.03916667762822586,\n\
\ \"acc_norm\": 0.1941747572815534,\n \"acc_norm_stderr\": 0.03916667762822586\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.19658119658119658,\n\
\ \"acc_stderr\": 0.02603538609895129,\n \"acc_norm\": 0.19658119658119658,\n\
\ \"acc_norm_stderr\": 0.02603538609895129\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2822477650063857,\n\
\ \"acc_stderr\": 0.016095302969878544,\n \"acc_norm\": 0.2822477650063857,\n\
\ \"acc_norm_stderr\": 0.016095302969878544\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2254335260115607,\n \"acc_stderr\": 0.022497230190967558,\n\
\ \"acc_norm\": 0.2254335260115607,\n \"acc_norm_stderr\": 0.022497230190967558\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574885,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574885\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.21895424836601307,\n \"acc_stderr\": 0.02367908986180772,\n\
\ \"acc_norm\": 0.21895424836601307,\n \"acc_norm_stderr\": 0.02367908986180772\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.19935691318327975,\n\
\ \"acc_stderr\": 0.022691033780549656,\n \"acc_norm\": 0.19935691318327975,\n\
\ \"acc_norm_stderr\": 0.022691033780549656\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2623456790123457,\n \"acc_stderr\": 0.024477222856135107,\n\
\ \"acc_norm\": 0.2623456790123457,\n \"acc_norm_stderr\": 0.024477222856135107\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.23049645390070922,\n \"acc_stderr\": 0.025123739226872405,\n \
\ \"acc_norm\": 0.23049645390070922,\n \"acc_norm_stderr\": 0.025123739226872405\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24771838331160365,\n\
\ \"acc_stderr\": 0.011025499291443737,\n \"acc_norm\": 0.24771838331160365,\n\
\ \"acc_norm_stderr\": 0.011025499291443737\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.15808823529411764,\n \"acc_stderr\": 0.02216146260806851,\n\
\ \"acc_norm\": 0.15808823529411764,\n \"acc_norm_stderr\": 0.02216146260806851\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2434640522875817,\n \"acc_stderr\": 0.017362473762146634,\n \
\ \"acc_norm\": 0.2434640522875817,\n \"acc_norm_stderr\": 0.017362473762146634\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.24545454545454545,\n\
\ \"acc_stderr\": 0.041220665028782834,\n \"acc_norm\": 0.24545454545454545,\n\
\ \"acc_norm_stderr\": 0.041220665028782834\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.19183673469387755,\n \"acc_stderr\": 0.025206963154225395,\n\
\ \"acc_norm\": 0.19183673469387755,\n \"acc_norm_stderr\": 0.025206963154225395\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.21393034825870647,\n\
\ \"acc_stderr\": 0.028996909693328927,\n \"acc_norm\": 0.21393034825870647,\n\
\ \"acc_norm_stderr\": 0.028996909693328927\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.16,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.16,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2469879518072289,\n\
\ \"acc_stderr\": 0.03357351982064537,\n \"acc_norm\": 0.2469879518072289,\n\
\ \"acc_norm_stderr\": 0.03357351982064537\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.031885780176863984,\n\
\ \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.031885780176863984\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2533659730722154,\n\
\ \"mc1_stderr\": 0.015225899340826844,\n \"mc2\": 0.47187304526757556,\n\
\ \"mc2_stderr\": 0.01576900200671721\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.4956590370955012,\n \"acc_stderr\": 0.014051956064076903\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/Cheng98/llama-39m
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_21T12_54_56.361777
path:
- '**/details_harness|arc:challenge|25_2024-03-21T12-54-56.361777.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-21T12-54-56.361777.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_21T12_54_56.361777
path:
- '**/details_harness|gsm8k|5_2024-03-21T12-54-56.361777.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-21T12-54-56.361777.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_21T12_54_56.361777
path:
- '**/details_harness|hellaswag|10_2024-03-21T12-54-56.361777.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-21T12-54-56.361777.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_21T12_54_56.361777
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T12-54-56.361777.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T12-54-56.361777.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T12-54-56.361777.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_21T12_54_56.361777
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T12-54-56.361777.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T12-54-56.361777.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_21T12_54_56.361777
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T12-54-56.361777.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T12-54-56.361777.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_21T12_54_56.361777
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T12-54-56.361777.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T12-54-56.361777.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_21T12_54_56.361777
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T12-54-56.361777.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T12-54-56.361777.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_21T12_54_56.361777
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T12-54-56.361777.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T12-54-56.361777.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_21T12_54_56.361777
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T12-54-56.361777.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T12-54-56.361777.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_21T12_54_56.361777
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T12-54-56.361777.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T12-54-56.361777.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_21T12_54_56.361777
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T12-54-56.361777.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T12-54-56.361777.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_21T12_54_56.361777
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T12-54-56.361777.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T12-54-56.361777.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_21T12_54_56.361777
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T12-54-56.361777.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T12-54-56.361777.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_21T12_54_56.361777
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T12-54-56.361777.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T12-54-56.361777.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_21T12_54_56.361777
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T12-54-56.361777.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T12-54-56.361777.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_21T12_54_56.361777
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T12-54-56.361777.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T12-54-56.361777.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_21T12_54_56.361777
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T12-54-56.361777.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T12-54-56.361777.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_21T12_54_56.361777
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T12-54-56.361777.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T12-54-56.361777.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_21T12_54_56.361777
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T12-54-56.361777.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T12-54-56.361777.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_21T12_54_56.361777
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T12-54-56.361777.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T12-54-56.361777.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_21T12_54_56.361777
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T12-54-56.361777.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T12-54-56.361777.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_21T12_54_56.361777
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T12-54-56.361777.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T12-54-56.361777.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_21T12_54_56.361777
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T12-54-56.361777.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T12-54-56.361777.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_21T12_54_56.361777
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T12-54-56.361777.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T12-54-56.361777.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_21T12_54_56.361777
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T12-54-56.361777.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T12-54-56.361777.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_21T12_54_56.361777
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T12-54-56.361777.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T12-54-56.361777.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_21T12_54_56.361777
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T12-54-56.361777.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T12-54-56.361777.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_21T12_54_56.361777
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T12-54-56.361777.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T12-54-56.361777.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_21T12_54_56.361777
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T12-54-56.361777.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T12-54-56.361777.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_21T12_54_56.361777
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T12-54-56.361777.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T12-54-56.361777.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_21T12_54_56.361777
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T12-54-56.361777.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T12-54-56.361777.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_21T12_54_56.361777
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T12-54-56.361777.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T12-54-56.361777.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_21T12_54_56.361777
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T12-54-56.361777.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T12-54-56.361777.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_21T12_54_56.361777
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T12-54-56.361777.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T12-54-56.361777.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_21T12_54_56.361777
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T12-54-56.361777.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T12-54-56.361777.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_21T12_54_56.361777
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T12-54-56.361777.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T12-54-56.361777.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_21T12_54_56.361777
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T12-54-56.361777.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T12-54-56.361777.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_21T12_54_56.361777
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T12-54-56.361777.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T12-54-56.361777.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_21T12_54_56.361777
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T12-54-56.361777.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T12-54-56.361777.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_21T12_54_56.361777
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T12-54-56.361777.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T12-54-56.361777.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_21T12_54_56.361777
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T12-54-56.361777.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T12-54-56.361777.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_21T12_54_56.361777
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-21T12-54-56.361777.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-21T12-54-56.361777.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_21T12_54_56.361777
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T12-54-56.361777.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T12-54-56.361777.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_21T12_54_56.361777
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T12-54-56.361777.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T12-54-56.361777.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_21T12_54_56.361777
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T12-54-56.361777.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T12-54-56.361777.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_21T12_54_56.361777
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T12-54-56.361777.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T12-54-56.361777.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_21T12_54_56.361777
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T12-54-56.361777.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T12-54-56.361777.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_21T12_54_56.361777
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T12-54-56.361777.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T12-54-56.361777.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_21T12_54_56.361777
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T12-54-56.361777.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T12-54-56.361777.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_21T12_54_56.361777
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T12-54-56.361777.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T12-54-56.361777.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_21T12_54_56.361777
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T12-54-56.361777.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T12-54-56.361777.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_21T12_54_56.361777
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T12-54-56.361777.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T12-54-56.361777.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_21T12_54_56.361777
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T12-54-56.361777.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T12-54-56.361777.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_21T12_54_56.361777
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T12-54-56.361777.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T12-54-56.361777.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_21T12_54_56.361777
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T12-54-56.361777.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T12-54-56.361777.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_21T12_54_56.361777
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T12-54-56.361777.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T12-54-56.361777.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_21T12_54_56.361777
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T12-54-56.361777.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T12-54-56.361777.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_21T12_54_56.361777
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T12-54-56.361777.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T12-54-56.361777.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_21T12_54_56.361777
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T12-54-56.361777.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T12-54-56.361777.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_21T12_54_56.361777
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T12-54-56.361777.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T12-54-56.361777.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_21T12_54_56.361777
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-21T12-54-56.361777.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-21T12-54-56.361777.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_21T12_54_56.361777
path:
- '**/details_harness|winogrande|5_2024-03-21T12-54-56.361777.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-21T12-54-56.361777.parquet'
- config_name: results
data_files:
- split: 2024_03_21T12_54_56.361777
path:
- results_2024-03-21T12-54-56.361777.parquet
- split: latest
path:
- results_2024-03-21T12-54-56.361777.parquet
---
# Dataset Card for Evaluation run of Cheng98/llama-39m
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Cheng98/llama-39m](https://huggingface.co/Cheng98/llama-39m) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Cheng98__llama-39m",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-03-21T12:54:56.361777](https://huggingface.co/datasets/open-llm-leaderboard/details_Cheng98__llama-39m/blob/main/results_2024-03-21T12-54-56.361777.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.24266373172846512,
"acc_stderr": 0.03036219204331094,
"acc_norm": 0.24328328699147694,
"acc_norm_stderr": 0.03116947755557392,
"mc1": 0.2533659730722154,
"mc1_stderr": 0.015225899340826844,
"mc2": 0.47187304526757556,
"mc2_stderr": 0.01576900200671721
},
"harness|arc:challenge|25": {
"acc": 0.19283276450511946,
"acc_stderr": 0.011529055465663334,
"acc_norm": 0.24061433447098976,
"acc_norm_stderr": 0.012491468532390568
},
"harness|hellaswag|10": {
"acc": 0.2566221868153754,
"acc_stderr": 0.00435876459640103,
"acc_norm": 0.2557259510057757,
"acc_norm_stderr": 0.00435376873064457
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.037857144650666544,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.037857144650666544
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.20394736842105263,
"acc_stderr": 0.03279000406310052,
"acc_norm": 0.20394736842105263,
"acc_norm_stderr": 0.03279000406310052
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.22264150943396227,
"acc_stderr": 0.0256042334708991,
"acc_norm": 0.22264150943396227,
"acc_norm_stderr": 0.0256042334708991
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3125,
"acc_stderr": 0.038760854559127644,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.038760854559127644
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.14,
"acc_stderr": 0.03487350880197771,
"acc_norm": 0.14,
"acc_norm_stderr": 0.03487350880197771
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.26011560693641617,
"acc_stderr": 0.033450369167889904,
"acc_norm": 0.26011560693641617,
"acc_norm_stderr": 0.033450369167889904
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.042801058373643966,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.042801058373643966
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.25957446808510637,
"acc_stderr": 0.02865917937429232,
"acc_norm": 0.25957446808510637,
"acc_norm_stderr": 0.02865917937429232
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.22807017543859648,
"acc_stderr": 0.03947152782669415,
"acc_norm": 0.22807017543859648,
"acc_norm_stderr": 0.03947152782669415
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2482758620689655,
"acc_stderr": 0.036001056927277716,
"acc_norm": 0.2482758620689655,
"acc_norm_stderr": 0.036001056927277716
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2671957671957672,
"acc_stderr": 0.022789673145776557,
"acc_norm": 0.2671957671957672,
"acc_norm_stderr": 0.022789673145776557
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.23809523809523808,
"acc_stderr": 0.03809523809523812,
"acc_norm": 0.23809523809523808,
"acc_norm_stderr": 0.03809523809523812
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2709677419354839,
"acc_stderr": 0.02528441611490016,
"acc_norm": 0.2709677419354839,
"acc_norm_stderr": 0.02528441611490016
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2561576354679803,
"acc_stderr": 0.030712730070982592,
"acc_norm": 0.2561576354679803,
"acc_norm_stderr": 0.030712730070982592
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21212121212121213,
"acc_stderr": 0.03192271569548299,
"acc_norm": 0.21212121212121213,
"acc_norm_stderr": 0.03192271569548299
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.30808080808080807,
"acc_stderr": 0.032894773300986155,
"acc_norm": 0.30808080808080807,
"acc_norm_stderr": 0.032894773300986155
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.32124352331606215,
"acc_stderr": 0.033699508685490674,
"acc_norm": 0.32124352331606215,
"acc_norm_stderr": 0.033699508685490674
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2641025641025641,
"acc_stderr": 0.022352193737453285,
"acc_norm": 0.2641025641025641,
"acc_norm_stderr": 0.022352193737453285
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.026719240783712163,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.026719240783712163
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.31932773109243695,
"acc_stderr": 0.030283995525884396,
"acc_norm": 0.31932773109243695,
"acc_norm_stderr": 0.030283995525884396
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.26490066225165565,
"acc_stderr": 0.036030385453603826,
"acc_norm": 0.26490066225165565,
"acc_norm_stderr": 0.036030385453603826
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.22752293577981653,
"acc_stderr": 0.017974463578776502,
"acc_norm": 0.22752293577981653,
"acc_norm_stderr": 0.017974463578776502
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.18055555555555555,
"acc_stderr": 0.026232878971491656,
"acc_norm": 0.18055555555555555,
"acc_norm_stderr": 0.026232878971491656
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.21524663677130046,
"acc_stderr": 0.027584066602208263,
"acc_norm": 0.21524663677130046,
"acc_norm_stderr": 0.027584066602208263
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2231404958677686,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.2231404958677686,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.21296296296296297,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.21296296296296297,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3392857142857143,
"acc_stderr": 0.04493949068613539,
"acc_norm": 0.3392857142857143,
"acc_norm_stderr": 0.04493949068613539
},
"harness|hendrycksTest-management|5": {
"acc": 0.1941747572815534,
"acc_stderr": 0.03916667762822586,
"acc_norm": 0.1941747572815534,
"acc_norm_stderr": 0.03916667762822586
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.19658119658119658,
"acc_stderr": 0.02603538609895129,
"acc_norm": 0.19658119658119658,
"acc_norm_stderr": 0.02603538609895129
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2822477650063857,
"acc_stderr": 0.016095302969878544,
"acc_norm": 0.2822477650063857,
"acc_norm_stderr": 0.016095302969878544
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2254335260115607,
"acc_stderr": 0.022497230190967558,
"acc_norm": 0.2254335260115607,
"acc_norm_stderr": 0.022497230190967558
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574885,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574885
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.21895424836601307,
"acc_stderr": 0.02367908986180772,
"acc_norm": 0.21895424836601307,
"acc_norm_stderr": 0.02367908986180772
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.19935691318327975,
"acc_stderr": 0.022691033780549656,
"acc_norm": 0.19935691318327975,
"acc_norm_stderr": 0.022691033780549656
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2623456790123457,
"acc_stderr": 0.024477222856135107,
"acc_norm": 0.2623456790123457,
"acc_norm_stderr": 0.024477222856135107
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23049645390070922,
"acc_stderr": 0.025123739226872405,
"acc_norm": 0.23049645390070922,
"acc_norm_stderr": 0.025123739226872405
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24771838331160365,
"acc_stderr": 0.011025499291443737,
"acc_norm": 0.24771838331160365,
"acc_norm_stderr": 0.011025499291443737
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.15808823529411764,
"acc_stderr": 0.02216146260806851,
"acc_norm": 0.15808823529411764,
"acc_norm_stderr": 0.02216146260806851
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2434640522875817,
"acc_stderr": 0.017362473762146634,
"acc_norm": 0.2434640522875817,
"acc_norm_stderr": 0.017362473762146634
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.24545454545454545,
"acc_stderr": 0.041220665028782834,
"acc_norm": 0.24545454545454545,
"acc_norm_stderr": 0.041220665028782834
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.19183673469387755,
"acc_stderr": 0.025206963154225395,
"acc_norm": 0.19183673469387755,
"acc_norm_stderr": 0.025206963154225395
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.21393034825870647,
"acc_stderr": 0.028996909693328927,
"acc_norm": 0.21393034825870647,
"acc_norm_stderr": 0.028996909693328927
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.16,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.16,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2469879518072289,
"acc_stderr": 0.03357351982064537,
"acc_norm": 0.2469879518072289,
"acc_norm_stderr": 0.03357351982064537
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.031885780176863984,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.031885780176863984
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2533659730722154,
"mc1_stderr": 0.015225899340826844,
"mc2": 0.47187304526757556,
"mc2_stderr": 0.01576900200671721
},
"harness|winogrande|5": {
"acc": 0.4956590370955012,
"acc_stderr": 0.014051956064076903
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
nurank/hdb-guanaco-llama2-1k | ---
license: mit
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 755333
num_examples: 5557
- name: test
num_bytes: 189508
num_examples: 1390
download_size: 230940
dataset_size: 944841
---
|
tcvieira/mmarco-pt-1k | ---
license: mit
---
|
Amo/Gothic-1-multilingual-dialogue | ---
language:
- en
- de
- pl
- ru
size_categories:
- 1B<n<4B
---
UPLOADED FOR EDUCATIONAL PURPOSES
Audio dataset created from the Gothic 1 video game files, covering the German, English, Polish, and Russian (Snowball version) dubbings.
It contains both the WAV audio files of the separated dialogue lines and a text transcription file per NPC.
Example of the data structure:<br><br>
g1_[language].zip<br>
-bob (character folder)<br>
--Audio1.wav<br>
--Audio2.wav<br>
-steve (character folder)<br>
--Audio3.wav<br>
--Audio4.wav<br>
-bob.txt (transcription for bob)<br>
-steve.txt (transcription for steve)<br><br><br>
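Once a language archive is extracted, the per-character pairing of audio folders and transcription files can be walked with a few lines of Python. This is a sketch based on the layout described in this card; exact file naming may vary per archive.

```python
from pathlib import Path


def index_characters(root: Path) -> dict:
    """Pair each character's transcription file with that character's audio folder.

    `root` is an extracted g1_[language].zip directory containing one folder
    and one .txt transcription per character, as in the layout above.
    """
    index = {}
    for transcript in sorted(root.glob("*.txt")):
        name = transcript.stem  # e.g. "bob"
        folder = root / name
        # Collect that character's dialogue clips, if the folder exists.
        wavs = sorted(folder.glob("*.wav")) if folder.is_dir() else []
        index[name] = {"transcript": transcript, "audio": wavs}
    return index
```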
<br>Here is the process of extracting the audio files from Gothic using the modding tools, as well as the Python script for formatting the text transcriptions of the dialogue:<br>
https://youtu.be/2iV-FWuiVFo |
aisc-team-d1/healthsearchqa | ---
dataset_info:
features:
- name: id
dtype: float64
- name: question
dtype: string
splits:
- name: train
num_bytes: 170966
num_examples: 4436
download_size: 79303
dataset_size: 170966
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: unknown
task_categories:
- question-answering
language:
- en
tags:
- medical
size_categories:
- 1K<n<10K
---
# HealthSearchQA
Dataset of consumer health questions released by Google for the Med-PaLM paper ([arXiv preprint](https://arxiv.org/abs/2212.13138)).
From the [paper](https://www.nature.com/articles/s41586-023-06291-2):
> We curated our own additional dataset consisting of 3,173 commonly searched consumer questions,
> referred to as HealthSearchQA. The dataset was curated using seed medical conditions and their
> associated symptoms. We used the seed data to retrieve publicly-available commonly searched questions
> generated by a search engine, which were displayed to all users entering the seed terms. We publish the
> dataset as an open benchmark for answering medical questions from consumers and hope this will be a useful
> resource for the community, as a dataset reflecting real-world consumer concerns.
**Format:** Question only, free text response, open domain.
**Size:** 3,173.
**Example question:** How serious is atrial fibrillation?
**Example question:** What kind of cough comes with Covid?
**Example question:** Is blood in phlegm serious?
|
LinaAlhuri/ArabicImageNet | ---
task_categories:
- zero-shot-classification
- image-classification
language:
- ar
pretty_name: ArabicImageNet
size_categories:
- n<1K
---
# Arabic Translated ImageNet Dataset
---
## Overview
Welcome to the Arabic Translated ImageNet Dataset! This dataset is a version of the ImageNet dataset, translated into Arabic. ImageNet is a widely used benchmark for image classification tasks, and this translation aims to facilitate research and development in the Arabic language.
## Contents
1. **Arabic_query_short:** Arabic translations of the original ImageNet class labels.
2. **Arabic_query_long:** Arabic translations of the original ImageNet class labels within a simple template.
## Usage
- **Image Classification:** Use this dataset for training and evaluating models in the domain of image classification with a focus on the Arabic language.
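For zero-shot classification, short class labels are typically wrapped in a prompt template, which is what the `Arabic_query_long` column provides. A minimal sketch of that wrapping (the template shown here is an assumption for illustration, not the one used to build this dataset):

```python
def build_queries(labels, template="صورة {}"):
    """Wrap each short Arabic class label in a simple prompt template.

    `template` is a hypothetical example ("an image of {}"); the dataset's
    own Arabic_query_long column was produced with its own template.
    """
    return [template.format(label) for label in labels]
```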
## Dataset Translation and Considerations
This dataset has been translated using the Google Translation API. It's important to note that automated translation methods, including machine translation, may introduce biases and inaccuracies. The translations are generated algorithmically, and some categories may not have accurate or complete Arabic translations. Researchers and users are advised to be mindful of these limitations and consider the implications of incomplete or potentially inaccurate translations in their analyses.
|
AdapterOcean/med_alpaca_standardized_cluster_3_std | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: cluster
dtype: float64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 11107307
num_examples: 19284
download_size: 5486263
dataset_size: 11107307
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "med_alpaca_standardized_cluster_3_std"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/eina_tulle_isitwrongtotrytopickupgirlsinadungeon | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of eina_tulle (Dungeon ni Deai wo Motomeru no wa Machigatteiru no Darou ka)
This is the dataset of eina_tulle (Dungeon ni Deai wo Motomeru no wa Machigatteiru no Darou ka), containing 51 images and their tags.
Images are crawled from many sites (e.g. Danbooru, Pixiv, Zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
|
jp1924/KrespSpeech | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: sentence
dtype: string
- name: id
dtype: string
- name: dataSet
struct:
- name: version
dtype: string
- name: date
dtype: string
- name: typeInfo
struct:
- name: category
dtype: string
- name: subcategory
dtype: string
- name: place
dtype: string
- name: speakers
list:
- name: id
dtype: string
- name: gender
dtype: string
- name: type
dtype: string
- name: age
dtype: string
- name: residence
dtype: string
- name: inputType
dtype: string
- name: dialogs
list:
- name: speaker
dtype: string
- name: audioPath
dtype: string
- name: textPath
dtype: string
splits:
- name: train
num_bytes: 335639155312.5
num_examples: 2067668
- name: validation
num_bytes: 3382855559.25
num_examples: 20830
download_size: 324002692624
dataset_size: 339022010871.75
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
task_categories:
- automatic-speech-recognition
language:
- ko
tags:
- STT
- Audio
size_categories:
- 100B<n<1T
---
# Customer Service Speech (고객 응대 음성)
[AIHub](https://aihub.or.kr/aihubdata/data/view.do?currMenu=115&topMenu=100&aihubDataSe=realm&dataSetSn=87)
[builder_code](https://github.com/jp1924/HF_builders/tree/main) |
OdiaGenAI/odia_domain_knowledge_503 | ---
license: cc-by-nc-4.0
---
|
euclaise/WritingPrompts_preferences | ---
language:
- en
license: mit
size_categories:
- 100K<n<1M
task_categories:
- text-generation
pretty_name: WritingPrompts Preferences
dataset_info:
features:
- name: post_text
dtype: string
- name: post_title
dtype: string
- name: post_scores
dtype: int64
- name: comment_texts
sequence: string
- name: comment_scores
sequence: int64
- name: comment_times
sequence: string
splits:
- name: train
num_bytes: 2340246558
num_examples: 265174
download_size: 1357734208
dataset_size: 2340246558
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "WritingPrompts_preferences"
Human preference data from r/WritingPrompts |
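Each row stores a prompt post alongside parallel `comment_texts` and `comment_scores` sequences, so preference pairs for reward modeling can be derived by ranking comments by score. A sketch of that derivation (the `min_margin` knob is an added assumption, not part of the dataset):

```python
def preference_pairs(row, min_margin=1):
    """Turn one post's ranked comments into (chosen, rejected) story pairs.

    `row` follows the dataset schema: parallel `comment_texts` and
    `comment_scores` sequences. Pairs whose score gap is below `min_margin`
    are skipped to avoid near-ties.
    """
    ranked = sorted(
        zip(row["comment_texts"], row["comment_scores"]),
        key=lambda pair: pair[1],
        reverse=True,
    )
    pairs = []
    for i, (chosen, hi) in enumerate(ranked):
        for rejected, lo in ranked[i + 1:]:
            if hi - lo >= min_margin:
                pairs.append((chosen, rejected))
    return pairs
```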
coref-data/superglue_wsc_raw | ---
license: unknown
configs:
- config_name: wsc.fixed
data_files:
- split: train
path: "wsc.fixed/train-*.parquet"
- split: validation
path: "wsc.fixed/validation-*.parquet"
- split: test
path: "wsc.fixed/test-*.parquet"
- config_name: wsc
data_files:
- split: train
path: "wsc/train-*.parquet"
- split: validation
path: "wsc/validation-*.parquet"
- split: test
path: "wsc/test-*.parquet"
---
# Winograd Schema Challenge examples included in the SuperGLUE Benchmark
Specifically, the `wsc` and `wsc.fixed` configurations from the Hugging Face `super_glue` repository.
### Data Fields
- **`text`** (*`str`*): The text of the schema.
- **`span1_index`** (*`int`*): Starting word index of first entity.
- **`span2_index`** (*`int`*): Starting word index of second entity.
- **`span1_text`** (*`str`*): Textual representation of first entity.
- **`span2_text`** (*`str`*): Textual representation of second entity.
- **`idx`** (*`int`*): Index of the example in the dataset.
- **`label`** (*`bool`*): True if the two spans corefer.
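Given the field definitions above, a span's text can be recovered from its whitespace-word index (the `wsc.fixed` configuration corrected rows where the spans did not line up with the text). A minimal sketch; the example row in the usage note is illustrative, not taken from the dataset:

```python
def extract_span(example: dict, which: int) -> str:
    """Recover span 1 or span 2 of a WSC example from its word index.

    Span indices count whitespace-separated words; the span length is
    taken from the number of words in the provided span text.
    """
    words = example["text"].split()
    start = example[f"span{which}_index"]
    length = len(example[f"span{which}_text"].split())
    return " ".join(words[start:start + length])
```

For instance, with `text = "The city councilmen refused the demonstrators a permit because they feared violence ."`, `span1_index = 1` and `span1_text = "city councilmen"`, the function returns `"city councilmen"`.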
> "The primary SuperGLUE tasks are built on and derived from existing datasets. We refer users to the original licenses accompanying each dataset, but it is our understanding that these licenses allow for their use and redistribution in a research context."
```
@inproceedings{NEURIPS2019_4496bf24,
author = {Wang, Alex and Pruksachatkun, Yada and Nangia, Nikita and Singh, Amanpreet and Michael, Julian and Hill, Felix and Levy, Omer and Bowman, Samuel},
booktitle = {Advances in Neural Information Processing Systems},
editor = {H. Wallach and H. Larochelle and A. Beygelzimer and F. d\textquotesingle Alch\'{e}-Buc and E. Fox and R. Garnett},
pages = {},
publisher = {Curran Associates, Inc.},
title = {SuperGLUE: A Stickier Benchmark for General-Purpose Language Understanding Systems},
url = {https://proceedings.neurips.cc/paper_files/paper/2019/file/4496bf24afe7fab6f046bf4923da8de6-Paper.pdf},
volume = {32},
year = {2019}
}
```
### Contributions
Thanks to [@thomwolf](https://github.com/thomwolf), [@lewtun](https://github.com/lewtun), [@patrickvonplaten](https://github.com/patrickvonplaten) for adding this dataset. |
SicariusSicariiStuff/MATH_SUBSTRACT | ---
license: apache-2.0
---
|
nbtpj/movielens-1m-ratings | ---
dataset_info:
features:
- name: bucketized_user_age
dtype: float32
- name: movie_genres
sequence: int64
- name: movie_id
dtype: binary
- name: movie_title
dtype: binary
- name: timestamp
dtype: int64
- name: user_gender
dtype: bool
- name: user_id
dtype: binary
- name: user_occupation_label
dtype: int64
- name: user_occupation_text
dtype: binary
- name: user_rating
dtype: float32
- name: user_zip_code
dtype: binary
splits:
- name: train
num_bytes: 116192936
num_examples: 1000209
download_size: 43879407
dataset_size: 116192936
---
# Dataset Card for "movielens-1m-ratings"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
DamianUser/dataset | ---
license: mit
---
|
seansullivan/INT-BIZ | ---
license: other
license_name: int
license_link: LICENSE
---
|
open-llm-leaderboard/details_automerger__YamshadowExperiment28-7B | ---
pretty_name: Evaluation run of automerger/YamshadowExperiment28-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [automerger/YamshadowExperiment28-7B](https://huggingface.co/automerger/YamshadowExperiment28-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_automerger__YamshadowExperiment28-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-02T18:07:32.887630](https://huggingface.co/datasets/open-llm-leaderboard/details_automerger__YamshadowExperiment28-7B/blob/main/results_2024-04-02T18-07-32.887630.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6506362441525892,\n\
\ \"acc_stderr\": 0.03214511282866611,\n \"acc_norm\": 0.6494902958481049,\n\
\ \"acc_norm_stderr\": 0.03282602098787089,\n \"mc1\": 0.6328029375764994,\n\
\ \"mc1_stderr\": 0.016874805001453184,\n \"mc2\": 0.7852511937649902,\n\
\ \"mc2_stderr\": 0.013564195861737502\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7209897610921502,\n \"acc_stderr\": 0.013106784883601336,\n\
\ \"acc_norm\": 0.7329351535836177,\n \"acc_norm_stderr\": 0.01292893319649636\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7157936666002789,\n\
\ \"acc_stderr\": 0.004501137895230726,\n \"acc_norm\": 0.8924517028480382,\n\
\ \"acc_norm_stderr\": 0.0030917590945195384\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\
\ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n\
\ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n\
\ \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322666,\n\
\ \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322666\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \
\ \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n\
\ \"acc_stderr\": 0.036430371689585475,\n \"acc_norm\": 0.6473988439306358,\n\
\ \"acc_norm_stderr\": 0.036430371689585475\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107224,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107224\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.032400380867927465,\n\
\ \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.032400380867927465\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41534391534391535,\n \"acc_stderr\": 0.025379524910778398,\n \"\
acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.025379524910778398\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677171,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677171\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n\
\ \"acc_stderr\": 0.02341529343356853,\n \"acc_norm\": 0.7838709677419354,\n\
\ \"acc_norm_stderr\": 0.02341529343356853\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009182,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009182\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8131313131313131,\n \"acc_stderr\": 0.027772533334218967,\n \"\
acc_norm\": 0.8131313131313131,\n \"acc_norm_stderr\": 0.027772533334218967\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033456,\n\
\ \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033456\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6564102564102564,\n \"acc_stderr\": 0.024078696580635477,\n\
\ \"acc_norm\": 0.6564102564102564,\n \"acc_norm_stderr\": 0.024078696580635477\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.0303883535518868,\n \
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.0303883535518868\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"\
acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8440366972477065,\n \"acc_stderr\": 0.01555580271359017,\n \"\
acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.01555580271359017\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"\
acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8382352941176471,\n \"acc_stderr\": 0.025845017986926924,\n \"\
acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.025845017986926924\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290916,\n \
\ \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290916\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516302,\n \"\
acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516302\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8275862068965517,\n\
\ \"acc_stderr\": 0.013507943909371803,\n \"acc_norm\": 0.8275862068965517,\n\
\ \"acc_norm_stderr\": 0.013507943909371803\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7138728323699421,\n \"acc_stderr\": 0.024332146779134128,\n\
\ \"acc_norm\": 0.7138728323699421,\n \"acc_norm_stderr\": 0.024332146779134128\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.441340782122905,\n\
\ \"acc_stderr\": 0.016607021781050873,\n \"acc_norm\": 0.441340782122905,\n\
\ \"acc_norm_stderr\": 0.016607021781050873\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7091503267973857,\n \"acc_stderr\": 0.02600480036395213,\n\
\ \"acc_norm\": 0.7091503267973857,\n \"acc_norm_stderr\": 0.02600480036395213\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n\
\ \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n\
\ \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7283950617283951,\n \"acc_stderr\": 0.02474862449053737,\n\
\ \"acc_norm\": 0.7283950617283951,\n \"acc_norm_stderr\": 0.02474862449053737\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48936170212765956,\n \"acc_stderr\": 0.02982074719142248,\n \
\ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.02982074719142248\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47392438070404175,\n\
\ \"acc_stderr\": 0.012752858346533127,\n \"acc_norm\": 0.47392438070404175,\n\
\ \"acc_norm_stderr\": 0.012752858346533127\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.02824568739146292,\n\
\ \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.02824568739146292\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6715686274509803,\n \"acc_stderr\": 0.018999707383162673,\n \
\ \"acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.018999707383162673\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.02797982353874455,\n\
\ \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.02797982353874455\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\
\ \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n\
\ \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6328029375764994,\n\
\ \"mc1_stderr\": 0.016874805001453184,\n \"mc2\": 0.7852511937649902,\n\
\ \"mc2_stderr\": 0.013564195861737502\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8524072612470402,\n \"acc_stderr\": 0.00996871576547965\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7050796057619408,\n \
\ \"acc_stderr\": 0.012560698010954769\n }\n}\n```"
repo_url: https://huggingface.co/automerger/YamshadowExperiment28-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_02T18_07_32.887630
path:
- '**/details_harness|arc:challenge|25_2024-04-02T18-07-32.887630.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-02T18-07-32.887630.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_02T18_07_32.887630
path:
- '**/details_harness|gsm8k|5_2024-04-02T18-07-32.887630.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-02T18-07-32.887630.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_02T18_07_32.887630
path:
- '**/details_harness|hellaswag|10_2024-04-02T18-07-32.887630.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-02T18-07-32.887630.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_02T18_07_32.887630
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T18-07-32.887630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-02T18-07-32.887630.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T18-07-32.887630.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_02T18_07_32.887630
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T18-07-32.887630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T18-07-32.887630.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_02T18_07_32.887630
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T18-07-32.887630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T18-07-32.887630.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_02T18_07_32.887630
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T18-07-32.887630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T18-07-32.887630.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_02T18_07_32.887630
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T18-07-32.887630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T18-07-32.887630.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_02T18_07_32.887630
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T18-07-32.887630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T18-07-32.887630.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_02T18_07_32.887630
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T18-07-32.887630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T18-07-32.887630.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_02T18_07_32.887630
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T18-07-32.887630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T18-07-32.887630.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_02T18_07_32.887630
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T18-07-32.887630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T18-07-32.887630.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_02T18_07_32.887630
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T18-07-32.887630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T18-07-32.887630.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_02T18_07_32.887630
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T18-07-32.887630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T18-07-32.887630.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_02T18_07_32.887630
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T18-07-32.887630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T18-07-32.887630.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_02T18_07_32.887630
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T18-07-32.887630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T18-07-32.887630.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_02T18_07_32.887630
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T18-07-32.887630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T18-07-32.887630.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_02T18_07_32.887630
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T18-07-32.887630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T18-07-32.887630.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_02T18_07_32.887630
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T18-07-32.887630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T18-07-32.887630.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_02T18_07_32.887630
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T18-07-32.887630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T18-07-32.887630.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_02T18_07_32.887630
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T18-07-32.887630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T18-07-32.887630.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_02T18_07_32.887630
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T18-07-32.887630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T18-07-32.887630.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_02T18_07_32.887630
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T18-07-32.887630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T18-07-32.887630.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_02T18_07_32.887630
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T18-07-32.887630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T18-07-32.887630.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_02T18_07_32.887630
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T18-07-32.887630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T18-07-32.887630.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_02T18_07_32.887630
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T18-07-32.887630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T18-07-32.887630.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_02T18_07_32.887630
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T18-07-32.887630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T18-07-32.887630.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_02T18_07_32.887630
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T18-07-32.887630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T18-07-32.887630.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_02T18_07_32.887630
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T18-07-32.887630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T18-07-32.887630.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_02T18_07_32.887630
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T18-07-32.887630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T18-07-32.887630.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_02T18_07_32.887630
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T18-07-32.887630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T18-07-32.887630.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_02T18_07_32.887630
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T18-07-32.887630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T18-07-32.887630.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_02T18_07_32.887630
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T18-07-32.887630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T18-07-32.887630.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_02T18_07_32.887630
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T18-07-32.887630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T18-07-32.887630.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_02T18_07_32.887630
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T18-07-32.887630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T18-07-32.887630.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_02T18_07_32.887630
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T18-07-32.887630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T18-07-32.887630.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_02T18_07_32.887630
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T18-07-32.887630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T18-07-32.887630.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_02T18_07_32.887630
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T18-07-32.887630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T18-07-32.887630.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_02T18_07_32.887630
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-02T18-07-32.887630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-02T18-07-32.887630.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_02T18_07_32.887630
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T18-07-32.887630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T18-07-32.887630.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_02T18_07_32.887630
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T18-07-32.887630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T18-07-32.887630.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_02T18_07_32.887630
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T18-07-32.887630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T18-07-32.887630.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_02T18_07_32.887630
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-02T18-07-32.887630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-02T18-07-32.887630.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_02T18_07_32.887630
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-02T18-07-32.887630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-02T18-07-32.887630.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_02T18_07_32.887630
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T18-07-32.887630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T18-07-32.887630.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_02T18_07_32.887630
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T18-07-32.887630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T18-07-32.887630.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_02T18_07_32.887630
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T18-07-32.887630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T18-07-32.887630.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_02T18_07_32.887630
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T18-07-32.887630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T18-07-32.887630.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_02T18_07_32.887630
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T18-07-32.887630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T18-07-32.887630.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_02T18_07_32.887630
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T18-07-32.887630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T18-07-32.887630.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_02T18_07_32.887630
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T18-07-32.887630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T18-07-32.887630.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_02T18_07_32.887630
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T18-07-32.887630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T18-07-32.887630.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_02T18_07_32.887630
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T18-07-32.887630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T18-07-32.887630.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_02T18_07_32.887630
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T18-07-32.887630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T18-07-32.887630.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_02T18_07_32.887630
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T18-07-32.887630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T18-07-32.887630.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_02T18_07_32.887630
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T18-07-32.887630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T18-07-32.887630.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_02T18_07_32.887630
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T18-07-32.887630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T18-07-32.887630.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_02T18_07_32.887630
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-02T18-07-32.887630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-02T18-07-32.887630.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_02T18_07_32.887630
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T18-07-32.887630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T18-07-32.887630.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_02T18_07_32.887630
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-02T18-07-32.887630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-02T18-07-32.887630.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_02T18_07_32.887630
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T18-07-32.887630.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T18-07-32.887630.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_02T18_07_32.887630
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-02T18-07-32.887630.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-02T18-07-32.887630.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_02T18_07_32.887630
path:
- '**/details_harness|winogrande|5_2024-04-02T18-07-32.887630.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-02T18-07-32.887630.parquet'
- config_name: results
data_files:
- split: 2024_04_02T18_07_32.887630
path:
- results_2024-04-02T18-07-32.887630.parquet
- split: latest
path:
- results_2024-04-02T18-07-32.887630.parquet
---
# Dataset Card for Evaluation run of automerger/YamshadowExperiment28-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [automerger/YamshadowExperiment28-7B](https://huggingface.co/automerger/YamshadowExperiment28-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_automerger__YamshadowExperiment28-7B",
"harness_winogrande_5",
split="train")
```
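The timestamped splits listed in the configurations above follow a fixed naming convention: the run timestamp with `-` and `:` replaced by `_` (so that the split name stays a valid identifier). As a minimal sketch, assuming only this convention, a helper like the following (hypothetical, not part of the `datasets` API) maps a run timestamp to its split name:

```python
def timestamp_to_split(run_timestamp: str) -> str:
    """Convert a run timestamp such as '2024-04-02T18:07:32.887630'
    into the corresponding split name used in this dataset's configs."""
    return run_timestamp.replace("-", "_").replace(":", "_")

# The run shown in this card:
print(timestamp_to_split("2024-04-02T18:07:32.887630"))
# -> 2024_04_02T18_07_32.887630
```

Passing that string as `split=` to `load_dataset` selects the results of that specific run, while `split="latest"` always resolves to the most recent one.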
## Latest results
These are the [latest results from run 2024-04-02T18:07:32.887630](https://huggingface.co/datasets/open-llm-leaderboard/details_automerger__YamshadowExperiment28-7B/blob/main/results_2024-04-02T18-07-32.887630.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6506362441525892,
"acc_stderr": 0.03214511282866611,
"acc_norm": 0.6494902958481049,
"acc_norm_stderr": 0.03282602098787089,
"mc1": 0.6328029375764994,
"mc1_stderr": 0.016874805001453184,
"mc2": 0.7852511937649902,
"mc2_stderr": 0.013564195861737502
},
"harness|arc:challenge|25": {
"acc": 0.7209897610921502,
"acc_stderr": 0.013106784883601336,
"acc_norm": 0.7329351535836177,
"acc_norm_stderr": 0.01292893319649636
},
"harness|hellaswag|10": {
"acc": 0.7157936666002789,
"acc_stderr": 0.004501137895230726,
"acc_norm": 0.8924517028480382,
"acc_norm_stderr": 0.0030917590945195384
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6943396226415094,
"acc_stderr": 0.028353298073322666,
"acc_norm": 0.6943396226415094,
"acc_norm_stderr": 0.028353298073322666
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.036430371689585475,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.036430371689585475
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107224,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107224
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.032400380867927465,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.032400380867927465
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41534391534391535,
"acc_stderr": 0.025379524910778398,
"acc_norm": 0.41534391534391535,
"acc_norm_stderr": 0.025379524910778398
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677171,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677171
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.02341529343356853,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.02341529343356853
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009182,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009182
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8131313131313131,
"acc_stderr": 0.027772533334218967,
"acc_norm": 0.8131313131313131,
"acc_norm_stderr": 0.027772533334218967
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.021500249576033456,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.021500249576033456
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6564102564102564,
"acc_stderr": 0.024078696580635477,
"acc_norm": 0.6564102564102564,
"acc_norm_stderr": 0.024078696580635477
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028593,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.0303883535518868,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.0303883535518868
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8440366972477065,
"acc_stderr": 0.01555580271359017,
"acc_norm": 0.8440366972477065,
"acc_norm_stderr": 0.01555580271359017
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5231481481481481,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.5231481481481481,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.025845017986926924,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.025845017986926924
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290916,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290916
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624714,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.03941897526516302,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.03941897526516302
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8275862068965517,
"acc_stderr": 0.013507943909371803,
"acc_norm": 0.8275862068965517,
"acc_norm_stderr": 0.013507943909371803
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7138728323699421,
"acc_stderr": 0.024332146779134128,
"acc_norm": 0.7138728323699421,
"acc_norm_stderr": 0.024332146779134128
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.441340782122905,
"acc_stderr": 0.016607021781050873,
"acc_norm": 0.441340782122905,
"acc_norm_stderr": 0.016607021781050873
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7091503267973857,
"acc_stderr": 0.02600480036395213,
"acc_norm": 0.7091503267973857,
"acc_norm_stderr": 0.02600480036395213
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.025755865922632945,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.025755865922632945
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7283950617283951,
"acc_stderr": 0.02474862449053737,
"acc_norm": 0.7283950617283951,
"acc_norm_stderr": 0.02474862449053737
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.02982074719142248,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.02982074719142248
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47392438070404175,
"acc_stderr": 0.012752858346533127,
"acc_norm": 0.47392438070404175,
"acc_norm_stderr": 0.012752858346533127
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.02824568739146292,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.02824568739146292
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6715686274509803,
"acc_stderr": 0.018999707383162673,
"acc_norm": 0.6715686274509803,
"acc_norm_stderr": 0.018999707383162673
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7428571428571429,
"acc_stderr": 0.02797982353874455,
"acc_norm": 0.7428571428571429,
"acc_norm_stderr": 0.02797982353874455
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6328029375764994,
"mc1_stderr": 0.016874805001453184,
"mc2": 0.7852511937649902,
"mc2_stderr": 0.013564195861737502
},
"harness|winogrande|5": {
"acc": 0.8524072612470402,
"acc_stderr": 0.00996871576547965
},
"harness|gsm8k|5": {
"acc": 0.7050796057619408,
"acc_stderr": 0.012560698010954769
}
}
```
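For a quick sanity check, the nested results above can be aggregated directly in Python. The sketch below averages `acc` over a small excerpt copied verbatim from the JSON above; the `mean_acc` helper is illustrative, not part of the leaderboard tooling, but the full file has the same shape, so it works there unchanged:

```python
# A few entries copied from the results JSON above.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.32},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6444444444444445},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.7039473684210527},
}

def mean_acc(results: dict, prefix: str = "harness|hendrycksTest-") -> float:
    """Mean `acc` over all tasks whose key starts with `prefix`."""
    accs = [v["acc"] for k, v in results.items() if k.startswith(prefix)]
    return sum(accs) / len(accs)

print(round(mean_acc(results), 4))  # -> 0.5561
```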
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
yavuz789/npcdia | ---
license: mit
dataset_info:
features:
- name: example
struct:
- name: Instruction
dtype: string
- name: Response
dtype: string
- name: id
dtype: string
splits:
- name: train
num_bytes: 14998
num_examples: 16
- name: test
num_bytes: 4227
num_examples: 5
download_size: 22905
dataset_size: 19225
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
AdapterOcean/python3-standardized_cluster_3 | ---
dataset_info:
features:
- name: text
dtype: string
- name: conversation_id
dtype: int64
- name: embedding
sequence: float64
- name: cluster
dtype: int64
splits:
- name: train
num_bytes: 57993204
num_examples: 5031
download_size: 0
dataset_size: 57993204
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "python3-standardized_cluster_3"
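The per-row `embedding` vectors can be compared with cosine similarity, for example to look for near-duplicate conversations within a `cluster`. A self-contained sketch with toy 3-dimensional vectors (the real embedding dimensionality is not documented in this card, so the vectors here are purely illustrative):

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Toy vectors standing in for the dataset's `embedding` field.
print(round(cosine([1.0, 0.0, 0.0], [1.0, 1.0, 0.0]), 4))  # -> 0.7071
```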
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mb23/cvt2_GS3_1 | ---
dataset_info:
features:
- name: image
dtype: image
- name: caption
dtype: string
splits:
- name: train
num_bytes: 1832280306.75
num_examples: 13258
- name: test
num_bytes: 640923801.75
num_examples: 4722
download_size: 2373732866
dataset_size: 2473204108.5
---
# Dataset Card for "cvt2_GS3_1"
* Adaptive histogram equalization applied to GraySpectrogram3
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jalFaizy/detect_chess_pieces | ---
annotations_creators:
- machine-generated
language_creators:
- machine-generated
language:
- en
license:
- other
multilinguality:
- monolingual
pretty_name: Object Detection for Chess Pieces
size_categories:
- n<1K
source_datasets: []
task_categories:
- object-detection
task_ids: []
---
# Dataset Card for Object Detection for Chess Pieces
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-instances)
- [Data Splits](#data-instances)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
## Dataset Description
- **Homepage:** https://github.com/faizankshaikh/chessDetection
- **Repository:** https://github.com/faizankshaikh/chessDetection
- **Paper:** -
- **Leaderboard:** -
- **Point of Contact:** [Faizan Shaikh](mailto:faizankshaikh@gmail.com)
### Dataset Summary
The "Object Detection for Chess Pieces" dataset is a toy dataset created (as the name suggests!) to introduce object detection in a beginner-friendly way. It is structured in a one-object-per-image manner, with the objects belonging to one of four classes: Black King, White King, Black Queen and White Queen
### Supported Tasks and Leaderboards
- `object-detection`: The dataset can be used to train and evaluate simple object detection models
### Languages
The text (labels) in the dataset is in English
## Dataset Structure
### Data Instances
A data point comprises an image and the corresponding objects in bounding boxes.
```
{
'image': <PIL.PngImagePlugin.PngImageFile image mode=RGB size=224x224 at 0x23557C66160>,
'objects': { "label": [ 0 ], "bbox": [ [ 151, 151, 26, 26 ] ] }
}
```
### Data Fields
- `image`: A `PIL.Image.Image` object containing the 224x224 image.
- `label`: An integer between 0 and 3 representing the classes with the following mapping:
| Label | Description |
| --- | --- |
| 0 | blackKing |
| 1 | blackQueen |
| 2 | whiteKing |
| 3 | whiteQueen |
- `bbox`: A list of integers in the order [x_center, y_center, width, height] describing a particular bounding box
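Since `bbox` uses a center-based layout, plotting and detection APIs that expect corner coordinates need a small conversion first. A minimal sketch (the `center_to_corners` helper is illustrative, not part of the dataset):

```python
def center_to_corners(bbox):
    """Convert [x_center, y_center, width, height] to
    [x_min, y_min, x_max, y_max]."""
    x_c, y_c, w, h = bbox
    return [x_c - w / 2, y_c - h / 2, x_c + w / 2, y_c + h / 2]

# The bounding box from the data instance shown above:
print(center_to_corners([151, 151, 26, 26]))  # -> [138.0, 138.0, 164.0, 164.0]
```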
### Data Splits
The data is split into training and validation set. The training set contains 204 images and the validation set 52 images.
## Dataset Creation
### Curation Rationale
The dataset was created to be a simple benchmark for object detection
### Source Data
#### Initial Data Collection and Normalization
The data is machine-generated by rendering images with the "python-chess" library. Please refer to [this code](https://github.com/faizankshaikh/chessDetection/blob/main/code/1.1%20create_images_with_labels.ipynb) to understand the data generation pipeline
#### Who are the source language producers?
[Needs More Information]
### Annotations
#### Annotation process
The annotations were done manually.
#### Who are the annotators?
The annotations were done manually.
### Personal and Sensitive Information
None
## Considerations for Using the Data
### Social Impact of Dataset
The dataset can be considered a beginner-friendly toy dataset for object detection. It should not be used for benchmarking state-of-the-art object detection models, or in a deployed model.
### Discussion of Biases
[Needs More Information]
### Other Known Limitations
The dataset only contains four classes for simplicity. The complexity can be increased by considering all types of chess pieces, and by making it a multi-object detection problem
## Additional Information
### Dataset Curators
The dataset was created by Faizan Shaikh
### Licensing Information
The dataset is licensed under CC-BY-SA 2.0
### Citation Information
[Needs More Information] |
intertwine-expel/expel-blog | ---
pretty_name: Expel Blog Posts Dataset
---
# Expel Blog Posts
JSON-formatted posts from the Expel Blog. |
dllllb/age-group-prediction | ---
task_categories:
- tabular-classification
tags:
- finance
pretty_name: Age group prediction 2019 competition by Sberbank
configs:
- config_name: transactions_train
data_files: transactions_train.csv.gz
- config_name: transactions_test
data_files: transactions_test.csv.gz
- config_name: test_ids
data_files: test.csv
- config_name: train_target
data_files: train_target.csv
---
https://ods.ai/competitions/sberbank-sirius-lesson |
agi-hackathon/Panda-70M-Original-Links | ---
license: apache-2.0
---
|
dim/AO3_fandom_chatbot_1to1 | ---
dataset_info:
features:
- name: conversation
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 1203600
num_examples: 614
download_size: 0
dataset_size: 1203600
---
# Dataset Card for "AO3_fandom_chatbot_1to1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Ba2han__HermesStar-OrcaWind-Synth-11B | ---
pretty_name: Evaluation run of Ba2han/HermesStar-OrcaWind-Synth-11B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Ba2han/HermesStar-OrcaWind-Synth-11B](https://huggingface.co/Ba2han/HermesStar-OrcaWind-Synth-11B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Ba2han__HermesStar-OrcaWind-Synth-11B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-04T17:07:13.650771](https://huggingface.co/datasets/open-llm-leaderboard/details_Ba2han__HermesStar-OrcaWind-Synth-11B/blob/main/results_2023-12-04T17-07-13.650771.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6532797554810081,\n\
\ \"acc_stderr\": 0.03195077848596749,\n \"acc_norm\": 0.6561799730695683,\n\
\ \"acc_norm_stderr\": 0.032588424826097764,\n \"mc1\": 0.31211750305997554,\n\
\ \"mc1_stderr\": 0.016220756769520932,\n \"mc2\": 0.48549947564682355,\n\
\ \"mc2_stderr\": 0.01511459123723551\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6126279863481229,\n \"acc_stderr\": 0.01423587248790987,\n\
\ \"acc_norm\": 0.6527303754266212,\n \"acc_norm_stderr\": 0.013913034529620448\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.644991037641904,\n\
\ \"acc_stderr\": 0.004775380866948017,\n \"acc_norm\": 0.836885082652858,\n\
\ \"acc_norm_stderr\": 0.0036871539405687955\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939098,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939098\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n\
\ \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n\
\ \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6710526315789473,\n \"acc_stderr\": 0.038234289699266046,\n\
\ \"acc_norm\": 0.6710526315789473,\n \"acc_norm_stderr\": 0.038234289699266046\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n\
\ \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.720754716981132,\n \"acc_stderr\": 0.027611163402399715,\n\
\ \"acc_norm\": 0.720754716981132,\n \"acc_norm_stderr\": 0.027611163402399715\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\"\
: 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6763005780346821,\n\
\ \"acc_stderr\": 0.035676037996391706,\n \"acc_norm\": 0.6763005780346821,\n\
\ \"acc_norm_stderr\": 0.035676037996391706\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n\
\ \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3994708994708995,\n \"acc_stderr\": 0.02522545028406788,\n \"\
acc_norm\": 0.3994708994708995,\n \"acc_norm_stderr\": 0.02522545028406788\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5317460317460317,\n\
\ \"acc_stderr\": 0.04463112720677173,\n \"acc_norm\": 0.5317460317460317,\n\
\ \"acc_norm_stderr\": 0.04463112720677173\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8064516129032258,\n\
\ \"acc_stderr\": 0.022475258525536057,\n \"acc_norm\": 0.8064516129032258,\n\
\ \"acc_norm_stderr\": 0.022475258525536057\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.03192271569548301,\n\
\ \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.03192271569548301\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267045,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267045\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328973,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328973\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6871794871794872,\n \"acc_stderr\": 0.02350757902064536,\n \
\ \"acc_norm\": 0.6871794871794872,\n \"acc_norm_stderr\": 0.02350757902064536\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.37037037037037035,\n \"acc_stderr\": 0.029443169323031537,\n \
\ \"acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.029443169323031537\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7016806722689075,\n \"acc_stderr\": 0.02971914287634285,\n \
\ \"acc_norm\": 0.7016806722689075,\n \"acc_norm_stderr\": 0.02971914287634285\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8477064220183487,\n \"acc_stderr\": 0.015405084393157074,\n \"\
acc_norm\": 0.8477064220183487,\n \"acc_norm_stderr\": 0.015405084393157074\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5601851851851852,\n \"acc_stderr\": 0.0338517797604481,\n \"acc_norm\"\
: 0.5601851851851852,\n \"acc_norm_stderr\": 0.0338517797604481\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8480392156862745,\n\
\ \"acc_stderr\": 0.025195658428931792,\n \"acc_norm\": 0.8480392156862745,\n\
\ \"acc_norm_stderr\": 0.025195658428931792\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.8312236286919831,\n \"acc_stderr\": 0.024381406832586223,\n\
\ \"acc_norm\": 0.8312236286919831,\n \"acc_norm_stderr\": 0.024381406832586223\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7130044843049327,\n\
\ \"acc_stderr\": 0.030360379710291947,\n \"acc_norm\": 0.7130044843049327,\n\
\ \"acc_norm_stderr\": 0.030360379710291947\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8264462809917356,\n \"acc_stderr\": 0.03457272836917671,\n \"\
acc_norm\": 0.8264462809917356,\n \"acc_norm_stderr\": 0.03457272836917671\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\
\ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\
\ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8275862068965517,\n\
\ \"acc_stderr\": 0.013507943909371803,\n \"acc_norm\": 0.8275862068965517,\n\
\ \"acc_norm_stderr\": 0.013507943909371803\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.023948512905468365,\n\
\ \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.023948512905468365\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3754189944134078,\n\
\ \"acc_stderr\": 0.01619510424846353,\n \"acc_norm\": 0.3754189944134078,\n\
\ \"acc_norm_stderr\": 0.01619510424846353\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7516339869281046,\n \"acc_stderr\": 0.02473998135511359,\n\
\ \"acc_norm\": 0.7516339869281046,\n \"acc_norm_stderr\": 0.02473998135511359\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n\
\ \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n\
\ \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7191358024691358,\n \"acc_stderr\": 0.025006469755799208,\n\
\ \"acc_norm\": 0.7191358024691358,\n \"acc_norm_stderr\": 0.025006469755799208\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \
\ \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46936114732724904,\n\
\ \"acc_stderr\": 0.012746237711716634,\n \"acc_norm\": 0.46936114732724904,\n\
\ \"acc_norm_stderr\": 0.012746237711716634\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7022058823529411,\n \"acc_stderr\": 0.02777829870154544,\n\
\ \"acc_norm\": 0.7022058823529411,\n \"acc_norm_stderr\": 0.02777829870154544\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6781045751633987,\n \"acc_stderr\": 0.01890101532209309,\n \
\ \"acc_norm\": 0.6781045751633987,\n \"acc_norm_stderr\": 0.01890101532209309\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.028666857790274648,\n\
\ \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.028666857790274648\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.02587064676616913,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.02587064676616913\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352203,\n \
\ \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352203\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.31211750305997554,\n\
\ \"mc1_stderr\": 0.016220756769520932,\n \"mc2\": 0.48549947564682355,\n\
\ \"mc2_stderr\": 0.01511459123723551\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8011049723756906,\n \"acc_stderr\": 0.011218629972515319\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5663381349507203,\n \
\ \"acc_stderr\": 0.013650728047064697\n }\n}\n```"
repo_url: https://huggingface.co/Ba2han/HermesStar-OrcaWind-Synth-11B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_04T17_07_13.650771
path:
- '**/details_harness|arc:challenge|25_2023-12-04T17-07-13.650771.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-04T17-07-13.650771.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_04T17_07_13.650771
path:
- '**/details_harness|gsm8k|5_2023-12-04T17-07-13.650771.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-04T17-07-13.650771.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_04T17_07_13.650771
path:
- '**/details_harness|hellaswag|10_2023-12-04T17-07-13.650771.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-04T17-07-13.650771.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_04T17_07_13.650771
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T17-07-13.650771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-04T17-07-13.650771.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T17-07-13.650771.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_04T17_07_13.650771
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T17-07-13.650771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T17-07-13.650771.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_04T17_07_13.650771
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T17-07-13.650771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T17-07-13.650771.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_04T17_07_13.650771
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T17-07-13.650771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T17-07-13.650771.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_04T17_07_13.650771
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T17-07-13.650771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T17-07-13.650771.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_04T17_07_13.650771
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T17-07-13.650771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T17-07-13.650771.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_04T17_07_13.650771
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T17-07-13.650771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T17-07-13.650771.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_04T17_07_13.650771
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T17-07-13.650771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T17-07-13.650771.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_04T17_07_13.650771
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T17-07-13.650771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T17-07-13.650771.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_04T17_07_13.650771
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T17-07-13.650771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T17-07-13.650771.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_04T17_07_13.650771
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T17-07-13.650771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T17-07-13.650771.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_04T17_07_13.650771
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T17-07-13.650771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T17-07-13.650771.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_04T17_07_13.650771
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T17-07-13.650771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T17-07-13.650771.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_04T17_07_13.650771
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T17-07-13.650771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T17-07-13.650771.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_04T17_07_13.650771
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T17-07-13.650771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T17-07-13.650771.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_04T17_07_13.650771
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T17-07-13.650771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T17-07-13.650771.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_04T17_07_13.650771
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T17-07-13.650771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T17-07-13.650771.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_04T17_07_13.650771
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T17-07-13.650771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T17-07-13.650771.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_04T17_07_13.650771
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T17-07-13.650771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T17-07-13.650771.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_04T17_07_13.650771
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T17-07-13.650771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T17-07-13.650771.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_04T17_07_13.650771
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T17-07-13.650771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T17-07-13.650771.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_04T17_07_13.650771
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T17-07-13.650771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T17-07-13.650771.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_04T17_07_13.650771
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T17-07-13.650771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T17-07-13.650771.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_04T17_07_13.650771
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T17-07-13.650771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T17-07-13.650771.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_04T17_07_13.650771
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T17-07-13.650771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T17-07-13.650771.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_04T17_07_13.650771
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T17-07-13.650771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T17-07-13.650771.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_04T17_07_13.650771
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T17-07-13.650771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T17-07-13.650771.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_04T17_07_13.650771
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T17-07-13.650771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T17-07-13.650771.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_04T17_07_13.650771
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T17-07-13.650771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T17-07-13.650771.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_04T17_07_13.650771
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T17-07-13.650771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T17-07-13.650771.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_04T17_07_13.650771
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T17-07-13.650771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T17-07-13.650771.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_04T17_07_13.650771
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T17-07-13.650771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T17-07-13.650771.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_04T17_07_13.650771
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T17-07-13.650771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T17-07-13.650771.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_04T17_07_13.650771
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T17-07-13.650771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T17-07-13.650771.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_04T17_07_13.650771
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T17-07-13.650771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T17-07-13.650771.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_04T17_07_13.650771
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-04T17-07-13.650771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-04T17-07-13.650771.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_04T17_07_13.650771
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T17-07-13.650771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T17-07-13.650771.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_04T17_07_13.650771
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T17-07-13.650771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T17-07-13.650771.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_04T17_07_13.650771
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T17-07-13.650771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T17-07-13.650771.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_04T17_07_13.650771
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-04T17-07-13.650771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-04T17-07-13.650771.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_04T17_07_13.650771
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-04T17-07-13.650771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-04T17-07-13.650771.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_04T17_07_13.650771
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T17-07-13.650771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T17-07-13.650771.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_04T17_07_13.650771
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T17-07-13.650771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T17-07-13.650771.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_04T17_07_13.650771
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T17-07-13.650771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T17-07-13.650771.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_04T17_07_13.650771
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T17-07-13.650771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T17-07-13.650771.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_04T17_07_13.650771
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T17-07-13.650771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T17-07-13.650771.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_04T17_07_13.650771
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T17-07-13.650771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T17-07-13.650771.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_04T17_07_13.650771
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T17-07-13.650771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T17-07-13.650771.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_04T17_07_13.650771
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T17-07-13.650771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T17-07-13.650771.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_04T17_07_13.650771
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T17-07-13.650771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T17-07-13.650771.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_04T17_07_13.650771
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T17-07-13.650771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T17-07-13.650771.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_04T17_07_13.650771
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T17-07-13.650771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T17-07-13.650771.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_04T17_07_13.650771
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T17-07-13.650771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T17-07-13.650771.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_04T17_07_13.650771
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T17-07-13.650771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T17-07-13.650771.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_04T17_07_13.650771
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-04T17-07-13.650771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-04T17-07-13.650771.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_04T17_07_13.650771
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T17-07-13.650771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T17-07-13.650771.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_04T17_07_13.650771
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-04T17-07-13.650771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-04T17-07-13.650771.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_04T17_07_13.650771
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T17-07-13.650771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T17-07-13.650771.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_04T17_07_13.650771
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-04T17-07-13.650771.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-04T17-07-13.650771.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_04T17_07_13.650771
path:
- '**/details_harness|winogrande|5_2023-12-04T17-07-13.650771.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-04T17-07-13.650771.parquet'
- config_name: results
data_files:
- split: 2023_12_04T17_07_13.650771
path:
- results_2023-12-04T17-07-13.650771.parquet
- split: latest
path:
- results_2023-12-04T17-07-13.650771.parquet
---
# Dataset Card for Evaluation run of Ba2han/HermesStar-OrcaWind-Synth-11B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Ba2han/HermesStar-OrcaWind-Synth-11B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Ba2han/HermesStar-OrcaWind-Synth-11B](https://huggingface.co/Ba2han/HermesStar-OrcaWind-Synth-11B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
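As a small illustration (a sketch inferred from the split names listed in this card, not an official helper), a run timestamp maps to its split name by replacing `-` and `:` with `_`:

```python
def run_timestamp_to_split_name(timestamp: str) -> str:
    """Convert a run timestamp (e.g. '2023-12-04T17:07:13.650771')
    into the split name used in this dataset's configurations
    (e.g. '2023_12_04T17_07_13.650771')."""
    return timestamp.replace("-", "_").replace(":", "_")

print(run_timestamp_to_split_name("2023-12-04T17:07:13.650771"))
```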
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Ba2han__HermesStar-OrcaWind-Synth-11B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-04T17:07:13.650771](https://huggingface.co/datasets/open-llm-leaderboard/details_Ba2han__HermesStar-OrcaWind-Synth-11B/blob/main/results_2023-12-04T17-07-13.650771.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6532797554810081,
"acc_stderr": 0.03195077848596749,
"acc_norm": 0.6561799730695683,
"acc_norm_stderr": 0.032588424826097764,
"mc1": 0.31211750305997554,
"mc1_stderr": 0.016220756769520932,
"mc2": 0.48549947564682355,
"mc2_stderr": 0.01511459123723551
},
"harness|arc:challenge|25": {
"acc": 0.6126279863481229,
"acc_stderr": 0.01423587248790987,
"acc_norm": 0.6527303754266212,
"acc_norm_stderr": 0.013913034529620448
},
"harness|hellaswag|10": {
"acc": 0.644991037641904,
"acc_stderr": 0.004775380866948017,
"acc_norm": 0.836885082652858,
"acc_norm_stderr": 0.0036871539405687955
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939098,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939098
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5851851851851851,
"acc_stderr": 0.04256193767901408,
"acc_norm": 0.5851851851851851,
"acc_norm_stderr": 0.04256193767901408
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6710526315789473,
"acc_stderr": 0.038234289699266046,
"acc_norm": 0.6710526315789473,
"acc_norm_stderr": 0.038234289699266046
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.720754716981132,
"acc_stderr": 0.027611163402399715,
"acc_norm": 0.720754716981132,
"acc_norm_stderr": 0.027611163402399715
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.035676037996391706,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.035676037996391706
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3994708994708995,
"acc_stderr": 0.02522545028406788,
"acc_norm": 0.3994708994708995,
"acc_norm_stderr": 0.02522545028406788
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5317460317460317,
"acc_stderr": 0.04463112720677173,
"acc_norm": 0.5317460317460317,
"acc_norm_stderr": 0.04463112720677173
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8064516129032258,
"acc_stderr": 0.022475258525536057,
"acc_norm": 0.8064516129032258,
"acc_norm_stderr": 0.022475258525536057
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.03192271569548301,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.03192271569548301
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267045,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267045
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328973,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328973
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6871794871794872,
"acc_stderr": 0.02350757902064536,
"acc_norm": 0.6871794871794872,
"acc_norm_stderr": 0.02350757902064536
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.029443169323031537,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.029443169323031537
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7016806722689075,
"acc_stderr": 0.02971914287634285,
"acc_norm": 0.7016806722689075,
"acc_norm_stderr": 0.02971914287634285
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.038615575462551684,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.038615575462551684
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8477064220183487,
"acc_stderr": 0.015405084393157074,
"acc_norm": 0.8477064220183487,
"acc_norm_stderr": 0.015405084393157074
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5601851851851852,
"acc_stderr": 0.0338517797604481,
"acc_norm": 0.5601851851851852,
"acc_norm_stderr": 0.0338517797604481
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.025195658428931792,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.025195658428931792
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8312236286919831,
"acc_stderr": 0.024381406832586223,
"acc_norm": 0.8312236286919831,
"acc_norm_stderr": 0.024381406832586223
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7130044843049327,
"acc_stderr": 0.030360379710291947,
"acc_norm": 0.7130044843049327,
"acc_norm_stderr": 0.030360379710291947
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8264462809917356,
"acc_stderr": 0.03457272836917671,
"acc_norm": 0.8264462809917356,
"acc_norm_stderr": 0.03457272836917671
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8275862068965517,
"acc_stderr": 0.013507943909371803,
"acc_norm": 0.8275862068965517,
"acc_norm_stderr": 0.013507943909371803
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.023948512905468365,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.023948512905468365
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3754189944134078,
"acc_stderr": 0.01619510424846353,
"acc_norm": 0.3754189944134078,
"acc_norm_stderr": 0.01619510424846353
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7516339869281046,
"acc_stderr": 0.02473998135511359,
"acc_norm": 0.7516339869281046,
"acc_norm_stderr": 0.02473998135511359
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7191358024691358,
"acc_stderr": 0.025006469755799208,
"acc_norm": 0.7191358024691358,
"acc_norm_stderr": 0.025006469755799208
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46936114732724904,
"acc_stderr": 0.012746237711716634,
"acc_norm": 0.46936114732724904,
"acc_norm_stderr": 0.012746237711716634
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7022058823529411,
"acc_stderr": 0.02777829870154544,
"acc_norm": 0.7022058823529411,
"acc_norm_stderr": 0.02777829870154544
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6781045751633987,
"acc_stderr": 0.01890101532209309,
"acc_norm": 0.6781045751633987,
"acc_norm_stderr": 0.01890101532209309
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.028666857790274648,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.028666857790274648
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616913,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616913
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352203,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352203
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.31211750305997554,
"mc1_stderr": 0.016220756769520932,
"mc2": 0.48549947564682355,
"mc2_stderr": 0.01511459123723551
},
"harness|winogrande|5": {
"acc": 0.8011049723756906,
"acc_stderr": 0.011218629972515319
},
"harness|gsm8k|5": {
"acc": 0.5663381349507203,
"acc_stderr": 0.013650728047064697
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
adambjorn/UnrelatedForgettingOverhead | ---
license: openrail
dataset_info:
- config_name: creative
features:
- name: num_comments
dtype: int64
- name: downs
dtype: int64
- name: id
dtype: string
- name: score
dtype: int64
- name: selftext
dtype: string
- name: title
dtype: string
- name: ups
dtype: int64
splits:
- name: train
num_bytes: 232907.63636363635
num_examples: 200
- name: test
num_bytes: 87340.36363636363
num_examples: 75
download_size: 212494
dataset_size: 320248.0
- config_name: dialogsum
features:
- name: id
dtype: string
- name: dialogue
dtype: string
- name: summary
dtype: string
- name: topic
dtype: string
splits:
- name: train
num_bytes: 169441
num_examples: 200
- name: test
num_bytes: 61468
num_examples: 75
download_size: 128720
dataset_size: 230909
- config_name: glue-mrpc
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype:
class_label:
names:
'0': not_equivalent
'1': equivalent
- name: idx
dtype: int32
splits:
- name: train
num_bytes: 52320
num_examples: 200
- name: test
num_bytes: 19167
num_examples: 75
download_size: 58432
dataset_size: 71487
- config_name: ropes
features:
- name: id
dtype: string
- name: background
dtype: string
- name: situation
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
splits:
- name: test
num_bytes: 69611
num_examples: 75
download_size: 17863
dataset_size: 69611
- config_name: winograd_wsc
features:
- name: text
dtype: string
- name: pronoun
dtype: string
- name: pronoun_loc
dtype: int32
- name: quote
dtype: string
- name: quote_loc
dtype: int32
- name: options
sequence: string
- name: label
dtype:
class_label:
names:
'0': '0'
'1': '1'
- name: source
dtype: string
splits:
- name: train
num_bytes: 37288
num_examples: 200
- name: test
num_bytes: 13852
num_examples: 75
download_size: 36050
dataset_size: 51140
configs:
- config_name: creative
data_files:
- split: train
path: creative/train-*
- split: test
path: creative/test-*
- config_name: dialogsum
data_files:
- split: train
path: dialogsum/train-*
- split: test
path: dialogsum/test-*
- config_name: glue-mrpc
data_files:
- split: train
path: glue-mrpc/train-*
- split: test
path: glue-mrpc/test-*
- config_name: ropes
data_files:
- split: test
path: ropes/test-*
- config_name: winograd_wsc
data_files:
- split: train
path: winograd_wsc/train-*
- split: test
path: winograd_wsc/test-*
---
|
cakiki/typescript_paths | ---
dataset_info:
features:
- name: repository_name
dtype: string
splits:
- name: train
num_bytes: 536493166
num_examples: 19441648
download_size: 434213958
dataset_size: 536493166
---
# Dataset Card for "typescript_paths"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mask-distilled-one-sec-cv12/chunk_199 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1099790528
num_examples: 215984
download_size: 1119730663
dataset_size: 1099790528
---
# Dataset Card for "chunk_199"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/sims_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of sims/シムス/西姆斯 (Azur Lane)
This is the dataset of sims/シムス/西姆斯 (Azur Lane), containing 100 images and their tags.
The core tags of this character are `animal_ears, cat_ears, long_hair, green_eyes, white_hair, animal_ear_fluff, bangs, parted_bangs, two_side_up, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 100 | 101.38 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sims_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 100 | 62.05 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sims_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 236 | 134.12 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sims_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 100 | 94.28 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sims_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 236 | 188.49 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sims_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide a raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/sims_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 13 |  |  |  |  |  | 1girl, black_gloves, elbow_gloves, solo, white_background, white_dress, american_flag_print, black_thighhighs, blush, fingerless_gloves, looking_at_viewer, print_necktie, simple_background, sleeveless_dress, very_long_hair, open_mouth, :3, black_jacket, closed_mouth, collarbone, extra_ears, :d, bare_shoulders, boots, full_body, grey_footwear, standing |
| 1 | 14 |  |  |  |  |  | 1girl, bare_shoulders, solo, white_sailor_collar, blush, looking_at_viewer, sleeveless_shirt, white_background, black_gloves, black_shirt, fingerless_gloves, collarbone, black_hairband, red_collar, simple_background, very_long_hair, :3, belt_collar, closed_mouth, forehead, red_necktie, white_skirt, blue_eyes, fingernails, vest, heart, neckerchief, thighhighs, :d, garter_straps, open_mouth |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_gloves | elbow_gloves | solo | white_background | white_dress | american_flag_print | black_thighhighs | blush | fingerless_gloves | looking_at_viewer | print_necktie | simple_background | sleeveless_dress | very_long_hair | open_mouth | :3 | black_jacket | closed_mouth | collarbone | extra_ears | :d | bare_shoulders | boots | full_body | grey_footwear | standing | white_sailor_collar | sleeveless_shirt | black_shirt | black_hairband | red_collar | belt_collar | forehead | red_necktie | white_skirt | blue_eyes | fingernails | vest | heart | neckerchief | thighhighs | garter_straps |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:---------------|:-------|:-------------------|:--------------|:----------------------|:-------------------|:--------|:--------------------|:--------------------|:----------------|:--------------------|:-------------------|:-----------------|:-------------|:-----|:---------------|:---------------|:-------------|:-------------|:-----|:-----------------|:--------|:------------|:----------------|:-----------|:----------------------|:-------------------|:--------------|:-----------------|:-------------|:--------------|:-----------|:--------------|:--------------|:------------|:--------------|:-------|:--------|:--------------|:-------------|:----------------|
| 0 | 13 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | |
| 1 | 14 |  |  |  |  |  | X | X | | X | X | | | | X | X | X | | X | | X | X | X | | X | X | | X | X | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
napsternxg/openrecipes-20170107-061401-recipeitems | ---
license: cc
language:
- en
pretty_name: open recipes
size_categories:
- 100K<n<1M
---
# Open Recipes
Open Recipes is an open database of recipe bookmarks.
Data taken from https://github.com/fictivekin/openrecipes
Please credit the original creators.
Collected via:
```bash
curl -O https://s3.amazonaws.com/openrecipes/20170107-061401-recipeitems.json.gz
```
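Assuming the dump is a gzipped, newline-delimited JSON file (the format the openrecipes exports use), it can be read record by record with the standard library alone. This is a minimal sketch; the helper name is our own, not part of any official tooling:

```python
import gzip
import io
import json

def iter_recipes(path_or_file):
    """Yield one recipe dict per line from a gzipped,
    newline-delimited JSON dump (assumed format of
    20170107-061401-recipeitems.json.gz)."""
    with gzip.open(path_or_file, mode="rt", encoding="utf-8") as fh:
        for line in fh:
            line = line.strip()
            if line:
                yield json.loads(line)

# Small in-memory demo with the same line-per-record structure:
sample = b'{"name": "Pancakes", "ingredients": "flour\\nmilk"}\n'
buf = io.BytesIO(gzip.compress(sample))
recipes = list(iter_recipes(buf))
print(recipes[0]["name"])  # Pancakes
```

To use it on the real dump, pass the downloaded filename instead of the in-memory buffer.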
## License
The Open Recipes Database is licensed under a <a rel="license" href="http://creativecommons.org/licenses/by/3.0/deed.en_US">Creative Commons Attribution 3.0 Unported License</a>.
<a rel="license" href="http://creativecommons.org/licenses/by/3.0/deed.en_US"><img alt="Creative Commons License" style="border-width:0" src="http://i.creativecommons.org/l/by/3.0/88x31.png" /></a>
© 2013 Fictive Kin LLC |
hle2000/Mintaka_Updated_Sequences_T5-xl-ssm | ---
dataset_info:
features:
- name: id
dtype: string
- name: question
dtype: string
- name: answerEntity
dtype: string
- name: questionEntity
dtype: string
- name: groundTruthAnswerEntity
dtype: string
- name: complexityType
dtype: string
- name: graph
dtype: string
- name: correct
dtype: bool
- name: g2t_sequence
dtype: string
- name: gap_sequence
dtype: string
- name: highlighted_g2t_sequence
dtype: string
- name: no_highlighted_g2t_sequence
dtype: string
- name: highlighted_gap_sequence
dtype: string
- name: no_highlighted_gap_sequence
dtype: string
- name: highlighted_determ_sequence
dtype: string
- name: no_highlighted_determ_sequence
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 178807874
num_examples: 61846
- name: test
num_bytes: 44949385
num_examples: 15583
download_size: 56499693
dataset_size: 223757259
---
# Dataset Card for "Mintaka_Updated_Sequences_T5-xl-ssm"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Chamoda/guanaco-llama2-800 | ---
dataset_info:
features:
- name: Prompt
dtype: string
- name: Completion
dtype: string
- name: 'Virality '
dtype: int64
- name: 'Unnamed: 3'
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 2872283
num_examples: 800
download_size: 1751882
dataset_size: 2872283
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "guanaco-llama2-800"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Nexdata/80000_sets_Multi_domain_Customer_Service_Dialogue_Text_Data | ---
license: cc-by-nc-nd-4.0
---
## Description
Multi-domain customer service dialogue text data, 90,000 sets in total, spanning multiple domains including telecommunications, e-commerce, finance, lifestyle, business, education, healthcare, and entertainment. Each set of data consists of single-turn or multi-turn conversations. This dataset can be used for tasks such as LLM training and ChatGPT-style applications.
For more details, please refer to the link: https://www.nexdata.ai/dataset/1396?source=Huggingface
## Data content
Customer service dialogue text data in the telecommunications, e-commerce, and financial domains
## Data Size
90,000 sets, Over 1,826,837 entries
## Data collection method
Multi-turn dialogue texts between customers and customer service representatives; all texts have been manually proofread and supplemented to ensure the coherence and completeness of the content, and all data has undergone anonymization.
## Storage Format
txt
## Language
Chinese
# Licensing Information
Commercial License
|
open-llm-leaderboard/details_TeeZee__Bielik-SOLAR-LIKE-10.7B-Instruct-v0.1 | ---
pretty_name: Evaluation run of TeeZee/Bielik-SOLAR-LIKE-10.7B-Instruct-v0.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TeeZee/Bielik-SOLAR-LIKE-10.7B-Instruct-v0.1](https://huggingface.co/TeeZee/Bielik-SOLAR-LIKE-10.7B-Instruct-v0.1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TeeZee__Bielik-SOLAR-LIKE-10.7B-Instruct-v0.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-11T05:36:06.304240](https://huggingface.co/datasets/open-llm-leaderboard/details_TeeZee__Bielik-SOLAR-LIKE-10.7B-Instruct-v0.1/blob/main/results_2024-04-11T05-36-06.304240.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4948186418663002,\n\
\ \"acc_stderr\": 0.03469818080912493,\n \"acc_norm\": 0.4961736407637409,\n\
\ \"acc_norm_stderr\": 0.03541625383200884,\n \"mc1\": 0.2778457772337821,\n\
\ \"mc1_stderr\": 0.015680929364024647,\n \"mc2\": 0.4595937728683279,\n\
\ \"mc2_stderr\": 0.015890451958885037\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4522184300341297,\n \"acc_stderr\": 0.01454451988063383,\n\
\ \"acc_norm\": 0.4726962457337884,\n \"acc_norm_stderr\": 0.014589589101985993\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5290778729336786,\n\
\ \"acc_stderr\": 0.004981336318033645,\n \"acc_norm\": 0.6915952997410875,\n\
\ \"acc_norm_stderr\": 0.0046089078729576955\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952344,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952344\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4222222222222222,\n\
\ \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.4222222222222222,\n\
\ \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4934210526315789,\n \"acc_stderr\": 0.040685900502249704,\n\
\ \"acc_norm\": 0.4934210526315789,\n \"acc_norm_stderr\": 0.040685900502249704\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.42,\n\
\ \"acc_stderr\": 0.04960449637488584,\n \"acc_norm\": 0.42,\n \
\ \"acc_norm_stderr\": 0.04960449637488584\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.539622641509434,\n \"acc_stderr\": 0.03067609659938918,\n\
\ \"acc_norm\": 0.539622641509434,\n \"acc_norm_stderr\": 0.03067609659938918\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4861111111111111,\n\
\ \"acc_stderr\": 0.041795966175810016,\n \"acc_norm\": 0.4861111111111111,\n\
\ \"acc_norm_stderr\": 0.041795966175810016\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456344,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456344\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\"\
: 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.47398843930635837,\n\
\ \"acc_stderr\": 0.03807301726504511,\n \"acc_norm\": 0.47398843930635837,\n\
\ \"acc_norm_stderr\": 0.03807301726504511\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3137254901960784,\n \"acc_stderr\": 0.04617034827006718,\n\
\ \"acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.04617034827006718\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n\
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.41702127659574467,\n \"acc_stderr\": 0.03223276266711712,\n\
\ \"acc_norm\": 0.41702127659574467,\n \"acc_norm_stderr\": 0.03223276266711712\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3157894736842105,\n\
\ \"acc_stderr\": 0.04372748290278006,\n \"acc_norm\": 0.3157894736842105,\n\
\ \"acc_norm_stderr\": 0.04372748290278006\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.43448275862068964,\n \"acc_stderr\": 0.04130740879555498,\n\
\ \"acc_norm\": 0.43448275862068964,\n \"acc_norm_stderr\": 0.04130740879555498\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3544973544973545,\n \"acc_stderr\": 0.024636830602842,\n \"acc_norm\"\
: 0.3544973544973545,\n \"acc_norm_stderr\": 0.024636830602842\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30952380952380953,\n\
\ \"acc_stderr\": 0.04134913018303316,\n \"acc_norm\": 0.30952380952380953,\n\
\ \"acc_norm_stderr\": 0.04134913018303316\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.6225806451612903,\n \"acc_stderr\": 0.02757596072327822,\n \"\
acc_norm\": 0.6225806451612903,\n \"acc_norm_stderr\": 0.02757596072327822\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.41379310344827586,\n \"acc_stderr\": 0.03465304488406796,\n \"\
acc_norm\": 0.41379310344827586,\n \"acc_norm_stderr\": 0.03465304488406796\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\"\
: 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.03713158067481913,\n\
\ \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.03713158067481913\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6060606060606061,\n \"acc_stderr\": 0.03481285338232963,\n \"\
acc_norm\": 0.6060606060606061,\n \"acc_norm_stderr\": 0.03481285338232963\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6010362694300518,\n \"acc_stderr\": 0.03533999094065696,\n\
\ \"acc_norm\": 0.6010362694300518,\n \"acc_norm_stderr\": 0.03533999094065696\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4794871794871795,\n \"acc_stderr\": 0.025329663163489943,\n\
\ \"acc_norm\": 0.4794871794871795,\n \"acc_norm_stderr\": 0.025329663163489943\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2851851851851852,\n \"acc_stderr\": 0.02752859921034049,\n \
\ \"acc_norm\": 0.2851851851851852,\n \"acc_norm_stderr\": 0.02752859921034049\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.453781512605042,\n \"acc_stderr\": 0.03233943468182088,\n \
\ \"acc_norm\": 0.453781512605042,\n \"acc_norm_stderr\": 0.03233943468182088\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.038227469376587525,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.038227469376587525\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6477064220183486,\n \"acc_stderr\": 0.020480568843998983,\n \"\
acc_norm\": 0.6477064220183486,\n \"acc_norm_stderr\": 0.020480568843998983\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4305555555555556,\n \"acc_stderr\": 0.03376922151252336,\n \"\
acc_norm\": 0.4305555555555556,\n \"acc_norm_stderr\": 0.03376922151252336\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6078431372549019,\n \"acc_stderr\": 0.03426712349247272,\n \"\
acc_norm\": 0.6078431372549019,\n \"acc_norm_stderr\": 0.03426712349247272\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6624472573839663,\n \"acc_stderr\": 0.03078154910202621,\n \
\ \"acc_norm\": 0.6624472573839663,\n \"acc_norm_stderr\": 0.03078154910202621\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5605381165919282,\n\
\ \"acc_stderr\": 0.033310925110381785,\n \"acc_norm\": 0.5605381165919282,\n\
\ \"acc_norm_stderr\": 0.033310925110381785\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5648854961832062,\n \"acc_stderr\": 0.04348208051644858,\n\
\ \"acc_norm\": 0.5648854961832062,\n \"acc_norm_stderr\": 0.04348208051644858\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6363636363636364,\n \"acc_stderr\": 0.043913262867240704,\n \"\
acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.043913262867240704\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5370370370370371,\n\
\ \"acc_stderr\": 0.04820403072760626,\n \"acc_norm\": 0.5370370370370371,\n\
\ \"acc_norm_stderr\": 0.04820403072760626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5950920245398773,\n \"acc_stderr\": 0.038566721635489125,\n\
\ \"acc_norm\": 0.5950920245398773,\n \"acc_norm_stderr\": 0.038566721635489125\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3482142857142857,\n\
\ \"acc_stderr\": 0.04521829902833585,\n \"acc_norm\": 0.3482142857142857,\n\
\ \"acc_norm_stderr\": 0.04521829902833585\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5922330097087378,\n \"acc_stderr\": 0.04865777570410768,\n\
\ \"acc_norm\": 0.5922330097087378,\n \"acc_norm_stderr\": 0.04865777570410768\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7564102564102564,\n\
\ \"acc_stderr\": 0.028120966503914414,\n \"acc_norm\": 0.7564102564102564,\n\
\ \"acc_norm_stderr\": 0.028120966503914414\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6360153256704981,\n\
\ \"acc_stderr\": 0.017205684809032232,\n \"acc_norm\": 0.6360153256704981,\n\
\ \"acc_norm_stderr\": 0.017205684809032232\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5346820809248555,\n \"acc_stderr\": 0.026854257928258886,\n\
\ \"acc_norm\": 0.5346820809248555,\n \"acc_norm_stderr\": 0.026854257928258886\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2737430167597765,\n\
\ \"acc_stderr\": 0.014912413096372434,\n \"acc_norm\": 0.2737430167597765,\n\
\ \"acc_norm_stderr\": 0.014912413096372434\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5588235294117647,\n \"acc_stderr\": 0.02843109544417664,\n\
\ \"acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.02843109544417664\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5401929260450161,\n\
\ \"acc_stderr\": 0.028306190403305696,\n \"acc_norm\": 0.5401929260450161,\n\
\ \"acc_norm_stderr\": 0.028306190403305696\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5216049382716049,\n \"acc_stderr\": 0.027794760105008736,\n\
\ \"acc_norm\": 0.5216049382716049,\n \"acc_norm_stderr\": 0.027794760105008736\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.02812163604063989,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.02812163604063989\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.37809647979139505,\n\
\ \"acc_stderr\": 0.012384878406798095,\n \"acc_norm\": 0.37809647979139505,\n\
\ \"acc_norm_stderr\": 0.012384878406798095\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4632352941176471,\n \"acc_stderr\": 0.030290619180485697,\n\
\ \"acc_norm\": 0.4632352941176471,\n \"acc_norm_stderr\": 0.030290619180485697\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4738562091503268,\n \"acc_stderr\": 0.020200164564804585,\n \
\ \"acc_norm\": 0.4738562091503268,\n \"acc_norm_stderr\": 0.020200164564804585\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5636363636363636,\n\
\ \"acc_stderr\": 0.04750185058907296,\n \"acc_norm\": 0.5636363636363636,\n\
\ \"acc_norm_stderr\": 0.04750185058907296\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5510204081632653,\n \"acc_stderr\": 0.03184213866687579,\n\
\ \"acc_norm\": 0.5510204081632653,\n \"acc_norm_stderr\": 0.03184213866687579\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6915422885572139,\n\
\ \"acc_stderr\": 0.03265819588512697,\n \"acc_norm\": 0.6915422885572139,\n\
\ \"acc_norm_stderr\": 0.03265819588512697\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.044084400227680794,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.044084400227680794\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.43373493975903615,\n\
\ \"acc_stderr\": 0.03858158940685516,\n \"acc_norm\": 0.43373493975903615,\n\
\ \"acc_norm_stderr\": 0.03858158940685516\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6257309941520468,\n \"acc_stderr\": 0.037116011853894806,\n\
\ \"acc_norm\": 0.6257309941520468,\n \"acc_norm_stderr\": 0.037116011853894806\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2778457772337821,\n\
\ \"mc1_stderr\": 0.015680929364024647,\n \"mc2\": 0.4595937728683279,\n\
\ \"mc2_stderr\": 0.015890451958885037\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6779794790844514,\n \"acc_stderr\": 0.013132070202071076\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.41470811220621684,\n \
\ \"acc_stderr\": 0.013570623842304506\n }\n}\n```"
repo_url: https://huggingface.co/TeeZee/Bielik-SOLAR-LIKE-10.7B-Instruct-v0.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_11T05_36_06.304240
path:
- '**/details_harness|arc:challenge|25_2024-04-11T05-36-06.304240.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-11T05-36-06.304240.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_11T05_36_06.304240
path:
- '**/details_harness|gsm8k|5_2024-04-11T05-36-06.304240.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-11T05-36-06.304240.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_11T05_36_06.304240
path:
- '**/details_harness|hellaswag|10_2024-04-11T05-36-06.304240.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-11T05-36-06.304240.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_11T05_36_06.304240
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-11T05-36-06.304240.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-11T05-36-06.304240.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-11T05-36-06.304240.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_11T05_36_06.304240
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-11T05-36-06.304240.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-11T05-36-06.304240.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_11T05_36_06.304240
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-11T05-36-06.304240.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-11T05-36-06.304240.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_11T05_36_06.304240
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-11T05-36-06.304240.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-11T05-36-06.304240.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_11T05_36_06.304240
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-11T05-36-06.304240.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-11T05-36-06.304240.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_11T05_36_06.304240
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-11T05-36-06.304240.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-11T05-36-06.304240.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_11T05_36_06.304240
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-11T05-36-06.304240.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-11T05-36-06.304240.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_11T05_36_06.304240
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-11T05-36-06.304240.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-11T05-36-06.304240.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_11T05_36_06.304240
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-11T05-36-06.304240.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-11T05-36-06.304240.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_11T05_36_06.304240
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-11T05-36-06.304240.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-11T05-36-06.304240.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_11T05_36_06.304240
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-11T05-36-06.304240.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-11T05-36-06.304240.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_11T05_36_06.304240
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-11T05-36-06.304240.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-11T05-36-06.304240.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_11T05_36_06.304240
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-11T05-36-06.304240.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-11T05-36-06.304240.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_11T05_36_06.304240
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-11T05-36-06.304240.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-11T05-36-06.304240.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_11T05_36_06.304240
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-11T05-36-06.304240.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-11T05-36-06.304240.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_11T05_36_06.304240
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-11T05-36-06.304240.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-11T05-36-06.304240.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_11T05_36_06.304240
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-11T05-36-06.304240.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-11T05-36-06.304240.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_11T05_36_06.304240
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-11T05-36-06.304240.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-11T05-36-06.304240.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_11T05_36_06.304240
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-11T05-36-06.304240.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-11T05-36-06.304240.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_11T05_36_06.304240
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-11T05-36-06.304240.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-11T05-36-06.304240.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_11T05_36_06.304240
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-11T05-36-06.304240.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-11T05-36-06.304240.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_11T05_36_06.304240
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-11T05-36-06.304240.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-11T05-36-06.304240.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_11T05_36_06.304240
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-11T05-36-06.304240.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-11T05-36-06.304240.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_11T05_36_06.304240
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-11T05-36-06.304240.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-11T05-36-06.304240.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_11T05_36_06.304240
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-11T05-36-06.304240.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-11T05-36-06.304240.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_11T05_36_06.304240
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-11T05-36-06.304240.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-11T05-36-06.304240.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_11T05_36_06.304240
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-11T05-36-06.304240.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-11T05-36-06.304240.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_11T05_36_06.304240
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-11T05-36-06.304240.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-11T05-36-06.304240.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_11T05_36_06.304240
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-11T05-36-06.304240.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-11T05-36-06.304240.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_11T05_36_06.304240
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-11T05-36-06.304240.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-11T05-36-06.304240.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_11T05_36_06.304240
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-11T05-36-06.304240.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-11T05-36-06.304240.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_11T05_36_06.304240
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-11T05-36-06.304240.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-11T05-36-06.304240.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_11T05_36_06.304240
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-11T05-36-06.304240.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-11T05-36-06.304240.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_11T05_36_06.304240
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-11T05-36-06.304240.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-11T05-36-06.304240.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_11T05_36_06.304240
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-11T05-36-06.304240.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-11T05-36-06.304240.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_11T05_36_06.304240
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-11T05-36-06.304240.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-11T05-36-06.304240.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_11T05_36_06.304240
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-11T05-36-06.304240.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-11T05-36-06.304240.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_11T05_36_06.304240
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-11T05-36-06.304240.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-11T05-36-06.304240.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_11T05_36_06.304240
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-11T05-36-06.304240.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-11T05-36-06.304240.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_11T05_36_06.304240
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-11T05-36-06.304240.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-11T05-36-06.304240.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_11T05_36_06.304240
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-11T05-36-06.304240.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-11T05-36-06.304240.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_11T05_36_06.304240
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-11T05-36-06.304240.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-11T05-36-06.304240.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_11T05_36_06.304240
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-11T05-36-06.304240.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-11T05-36-06.304240.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_11T05_36_06.304240
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-11T05-36-06.304240.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-11T05-36-06.304240.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_11T05_36_06.304240
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-11T05-36-06.304240.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-11T05-36-06.304240.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_11T05_36_06.304240
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-11T05-36-06.304240.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-11T05-36-06.304240.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_11T05_36_06.304240
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-11T05-36-06.304240.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-11T05-36-06.304240.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_11T05_36_06.304240
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-11T05-36-06.304240.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-11T05-36-06.304240.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_11T05_36_06.304240
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-11T05-36-06.304240.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-11T05-36-06.304240.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_11T05_36_06.304240
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-11T05-36-06.304240.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-11T05-36-06.304240.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_11T05_36_06.304240
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-11T05-36-06.304240.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-11T05-36-06.304240.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_11T05_36_06.304240
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-11T05-36-06.304240.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-11T05-36-06.304240.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_11T05_36_06.304240
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-11T05-36-06.304240.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-11T05-36-06.304240.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_11T05_36_06.304240
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-11T05-36-06.304240.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-11T05-36-06.304240.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_11T05_36_06.304240
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-11T05-36-06.304240.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-11T05-36-06.304240.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_11T05_36_06.304240
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-11T05-36-06.304240.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-11T05-36-06.304240.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_11T05_36_06.304240
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-11T05-36-06.304240.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-11T05-36-06.304240.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_11T05_36_06.304240
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-11T05-36-06.304240.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-11T05-36-06.304240.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_11T05_36_06.304240
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-11T05-36-06.304240.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-11T05-36-06.304240.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_11T05_36_06.304240
path:
- '**/details_harness|winogrande|5_2024-04-11T05-36-06.304240.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-11T05-36-06.304240.parquet'
- config_name: results
data_files:
- split: 2024_04_11T05_36_06.304240
path:
- results_2024-04-11T05-36-06.304240.parquet
- split: latest
path:
- results_2024-04-11T05-36-06.304240.parquet
---
# Dataset Card for Evaluation run of TeeZee/Bielik-SOLAR-LIKE-10.7B-Instruct-v0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [TeeZee/Bielik-SOLAR-LIKE-10.7B-Instruct-v0.1](https://huggingface.co/TeeZee/Bielik-SOLAR-LIKE-10.7B-Instruct-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TeeZee__Bielik-SOLAR-LIKE-10.7B-Instruct-v0.1",
"harness_winogrande_5",
split="train")
```
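The per-task config names follow a mechanical naming convention derived from the harness task names: a `harness_` prefix, with `|`, `:`, `-` and `.` replaced by underscores, followed by the few-shot count. A small sketch of that mapping — an observation from the config list in this card, not an official leaderboard API:

```python
def harness_config_name(task: str, num_fewshot: int) -> str:
    """Map a harness task name and few-shot count to this dataset's config name.

    Observed pattern: 'harness_' + task with '|', ':', '-' and '.' replaced
    by '_', then '_' and the few-shot count.
    """
    sanitized = task
    for ch in "|:-.":
        sanitized = sanitized.replace(ch, "_")
    return f"harness_{sanitized}_{num_fewshot}"

print(harness_config_name("hendrycksTest-world_religions", 5))
# harness_hendrycksTest_world_religions_5
print(harness_config_name("truthfulqa:mc", 0))
# harness_truthfulqa_mc_0
```

The resulting string can then be passed as the config argument to `load_dataset`, as in the snippet above.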
## Latest results
These are the [latest results from run 2024-04-11T05:36:06.304240](https://huggingface.co/datasets/open-llm-leaderboard/details_TeeZee__Bielik-SOLAR-LIKE-10.7B-Instruct-v0.1/blob/main/results_2024-04-11T05-36-06.304240.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4948186418663002,
"acc_stderr": 0.03469818080912493,
"acc_norm": 0.4961736407637409,
"acc_norm_stderr": 0.03541625383200884,
"mc1": 0.2778457772337821,
"mc1_stderr": 0.015680929364024647,
"mc2": 0.4595937728683279,
"mc2_stderr": 0.015890451958885037
},
"harness|arc:challenge|25": {
"acc": 0.4522184300341297,
"acc_stderr": 0.01454451988063383,
"acc_norm": 0.4726962457337884,
"acc_norm_stderr": 0.014589589101985993
},
"harness|hellaswag|10": {
"acc": 0.5290778729336786,
"acc_stderr": 0.004981336318033645,
"acc_norm": 0.6915952997410875,
"acc_norm_stderr": 0.0046089078729576955
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952344,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952344
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4222222222222222,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.4222222222222222,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4934210526315789,
"acc_stderr": 0.040685900502249704,
"acc_norm": 0.4934210526315789,
"acc_norm_stderr": 0.040685900502249704
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.42,
"acc_stderr": 0.04960449637488584,
"acc_norm": 0.42,
"acc_norm_stderr": 0.04960449637488584
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.539622641509434,
"acc_stderr": 0.03067609659938918,
"acc_norm": 0.539622641509434,
"acc_norm_stderr": 0.03067609659938918
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4861111111111111,
"acc_stderr": 0.041795966175810016,
"acc_norm": 0.4861111111111111,
"acc_norm_stderr": 0.041795966175810016
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456344,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456344
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.47398843930635837,
"acc_stderr": 0.03807301726504511,
"acc_norm": 0.47398843930635837,
"acc_norm_stderr": 0.03807301726504511
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3137254901960784,
"acc_stderr": 0.04617034827006718,
"acc_norm": 0.3137254901960784,
"acc_norm_stderr": 0.04617034827006718
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.41702127659574467,
"acc_stderr": 0.03223276266711712,
"acc_norm": 0.41702127659574467,
"acc_norm_stderr": 0.03223276266711712
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.04372748290278006,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.04372748290278006
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.43448275862068964,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.43448275862068964,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3544973544973545,
"acc_stderr": 0.024636830602842,
"acc_norm": 0.3544973544973545,
"acc_norm_stderr": 0.024636830602842
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30952380952380953,
"acc_stderr": 0.04134913018303316,
"acc_norm": 0.30952380952380953,
"acc_norm_stderr": 0.04134913018303316
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6225806451612903,
"acc_stderr": 0.02757596072327822,
"acc_norm": 0.6225806451612903,
"acc_norm_stderr": 0.02757596072327822
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.41379310344827586,
"acc_stderr": 0.03465304488406796,
"acc_norm": 0.41379310344827586,
"acc_norm_stderr": 0.03465304488406796
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.03713158067481913,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.03713158067481913
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6060606060606061,
"acc_stderr": 0.03481285338232963,
"acc_norm": 0.6060606060606061,
"acc_norm_stderr": 0.03481285338232963
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6010362694300518,
"acc_stderr": 0.03533999094065696,
"acc_norm": 0.6010362694300518,
"acc_norm_stderr": 0.03533999094065696
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4794871794871795,
"acc_stderr": 0.025329663163489943,
"acc_norm": 0.4794871794871795,
"acc_norm_stderr": 0.025329663163489943
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2851851851851852,
"acc_stderr": 0.02752859921034049,
"acc_norm": 0.2851851851851852,
"acc_norm_stderr": 0.02752859921034049
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.453781512605042,
"acc_stderr": 0.03233943468182088,
"acc_norm": 0.453781512605042,
"acc_norm_stderr": 0.03233943468182088
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.038227469376587525,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.038227469376587525
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6477064220183486,
"acc_stderr": 0.020480568843998983,
"acc_norm": 0.6477064220183486,
"acc_norm_stderr": 0.020480568843998983
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4305555555555556,
"acc_stderr": 0.03376922151252336,
"acc_norm": 0.4305555555555556,
"acc_norm_stderr": 0.03376922151252336
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6078431372549019,
"acc_stderr": 0.03426712349247272,
"acc_norm": 0.6078431372549019,
"acc_norm_stderr": 0.03426712349247272
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6624472573839663,
"acc_stderr": 0.03078154910202621,
"acc_norm": 0.6624472573839663,
"acc_norm_stderr": 0.03078154910202621
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5605381165919282,
"acc_stderr": 0.033310925110381785,
"acc_norm": 0.5605381165919282,
"acc_norm_stderr": 0.033310925110381785
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5648854961832062,
"acc_stderr": 0.04348208051644858,
"acc_norm": 0.5648854961832062,
"acc_norm_stderr": 0.04348208051644858
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.043913262867240704,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.043913262867240704
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.04820403072760626,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.04820403072760626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5950920245398773,
"acc_stderr": 0.038566721635489125,
"acc_norm": 0.5950920245398773,
"acc_norm_stderr": 0.038566721635489125
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3482142857142857,
"acc_stderr": 0.04521829902833585,
"acc_norm": 0.3482142857142857,
"acc_norm_stderr": 0.04521829902833585
},
"harness|hendrycksTest-management|5": {
"acc": 0.5922330097087378,
"acc_stderr": 0.04865777570410768,
"acc_norm": 0.5922330097087378,
"acc_norm_stderr": 0.04865777570410768
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7564102564102564,
"acc_stderr": 0.028120966503914414,
"acc_norm": 0.7564102564102564,
"acc_norm_stderr": 0.028120966503914414
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6360153256704981,
"acc_stderr": 0.017205684809032232,
"acc_norm": 0.6360153256704981,
"acc_norm_stderr": 0.017205684809032232
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5346820809248555,
"acc_stderr": 0.026854257928258886,
"acc_norm": 0.5346820809248555,
"acc_norm_stderr": 0.026854257928258886
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2737430167597765,
"acc_stderr": 0.014912413096372434,
"acc_norm": 0.2737430167597765,
"acc_norm_stderr": 0.014912413096372434
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5588235294117647,
"acc_stderr": 0.02843109544417664,
"acc_norm": 0.5588235294117647,
"acc_norm_stderr": 0.02843109544417664
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5401929260450161,
"acc_stderr": 0.028306190403305696,
"acc_norm": 0.5401929260450161,
"acc_norm_stderr": 0.028306190403305696
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5216049382716049,
"acc_stderr": 0.027794760105008736,
"acc_norm": 0.5216049382716049,
"acc_norm_stderr": 0.027794760105008736
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.02812163604063989,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.02812163604063989
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.37809647979139505,
"acc_stderr": 0.012384878406798095,
"acc_norm": 0.37809647979139505,
"acc_norm_stderr": 0.012384878406798095
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4632352941176471,
"acc_stderr": 0.030290619180485697,
"acc_norm": 0.4632352941176471,
"acc_norm_stderr": 0.030290619180485697
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4738562091503268,
"acc_stderr": 0.020200164564804585,
"acc_norm": 0.4738562091503268,
"acc_norm_stderr": 0.020200164564804585
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5636363636363636,
"acc_stderr": 0.04750185058907296,
"acc_norm": 0.5636363636363636,
"acc_norm_stderr": 0.04750185058907296
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5510204081632653,
"acc_stderr": 0.03184213866687579,
"acc_norm": 0.5510204081632653,
"acc_norm_stderr": 0.03184213866687579
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6915422885572139,
"acc_stderr": 0.03265819588512697,
"acc_norm": 0.6915422885572139,
"acc_norm_stderr": 0.03265819588512697
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.74,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.74,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-virology|5": {
"acc": 0.43373493975903615,
"acc_stderr": 0.03858158940685516,
"acc_norm": 0.43373493975903615,
"acc_norm_stderr": 0.03858158940685516
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6257309941520468,
"acc_stderr": 0.037116011853894806,
"acc_norm": 0.6257309941520468,
"acc_norm_stderr": 0.037116011853894806
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2778457772337821,
"mc1_stderr": 0.015680929364024647,
"mc2": 0.4595937728683279,
"mc2_stderr": 0.015890451958885037
},
"harness|winogrande|5": {
"acc": 0.6779794790844514,
"acc_stderr": 0.013132070202071076
},
"harness|gsm8k|5": {
"acc": 0.41470811220621684,
"acc_stderr": 0.013570623842304506
}
}
```
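Once downloaded, the results JSON above is plain nested dictionaries, so summarizing it needs no special tooling. A minimal sketch using a hand-copied subset of the numbers above (in practice you would `json.load()` the downloaded `results_*.json` file first):

```python
# Subset of the per-task results shown above.
results = {
    "harness|arc:challenge|25": {"acc_norm": 0.4726962457337884},
    "harness|hellaswag|10": {"acc_norm": 0.6915952997410875},
    "harness|winogrande|5": {"acc": 0.6779794790844514},
    "harness|gsm8k|5": {"acc": 0.41470811220621684},
}

# Prefer acc_norm when present, falling back to acc.
scores = {
    task: metrics.get("acc_norm", metrics.get("acc"))
    for task, metrics in results.items()
}
best = max(scores, key=scores.get)
print(best, round(scores[best], 4))
# harness|hellaswag|10 0.6916
```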
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]
ctoraman/BilTweetNews-sentiment-analysis
---
license: cc-by-nc-sa-4.0
language:
- tr
task_categories:
- text-classification
tags:
- sentiment analysis
- text classification
- tweets
- social media
- turkish
- sarcasm
- sarcastic
---
# Turkish Sentiment Analysis Tweet Dataset: BilTweetNews
The dataset contains tweets related to six major events from Turkish news sources between May 4, 2015
and Jan 8, 2017.
The dataset covers six major events, along with tweets unrelated to any of them:
- May 25, 2015 One of the popular football clubs in Turkey, Galatasaray, won the 2015 Turkish Super League.
- Sep 6, 2015 A terrorist group, called PKK, attacked soldiers in Dağlıca, a village in southeastern Turkey.
- Oct 7, 2015 A Turkish scientist, Aziz Sancar, won the 2015 Nobel Prize in Chemistry for his studies on DNA repair.
- May 27, 2016 A local football club from Alanya was promoted to the Turkish Super League for the first time in its history.
- Jun 17, 2016 A traditional anthem mostly played by secularists in Turkey, the 10th Year Anthem, was banned in schools by the director of national education in the Black Sea province of Bolu.
- Oct 17, 2016 A magazine show host mistakenly assumed that Madonna in a Fur Coat, a 1943 book by the celebrated Turkish writer Sabahattin Ali, was about pop star Madonna's life. The book actually tells the story of a Turkish student and a German singer after World War I.
- Not related to any news topic
For each event, 100 related-candidate and 60 unrelated-candidate tweets were selected. Lastly, 40 tweets that are potentially not related to any event were randomly selected; 5 of them were later removed as near-duplicates. The dataset contains 995 tweets in total.
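The total follows directly from the selection numbers above; a one-line sanity check:

```python
# 6 events x (100 related-candidate + 60 unrelated-candidate) tweets,
# plus 40 potentially unrelated tweets, minus 5 near-duplicates removed.
total = 6 * (100 + 60) + 40 - 5
print(total)  # 995
```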
There are 4 sentiment classes:
- Positive
- Negative
- Neutral
- Sarcastic
All tweets are labeled by 17 annotators. We provide the normalized distribution of annotations across the 4 sentiment classes, as well as the majority sentiment class in the last column. If multiple classes tie for the highest score, the majority is set to "Multi".
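Given the normalized annotation distribution, the majority label (including the tie handling described above) can be reproduced in a few lines. A sketch assuming each row carries one normalized score per class; the example scores are made up:

```python
def majority_label(scores: dict) -> str:
    """Return the class with the highest normalized score, or 'Multi' on a tie."""
    top = max(scores.values())
    winners = [label for label, s in scores.items() if s == top]
    return winners[0] if len(winners) == 1 else "Multi"

print(majority_label({"Positive": 0.6, "Negative": 0.1, "Neutral": 0.2, "Sarcastic": 0.1}))
# Positive
print(majority_label({"Positive": 0.4, "Negative": 0.4, "Neutral": 0.1, "Sarcastic": 0.1}))
# Multi
```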
Github Repo: https://github.com/BilkentInformationRetrievalGroup/BilTweetNews2017
## Citation
If you would like to use any material in this repository, please cite the following papers:
- Toraman, C. Early Prediction of Public Reactions to News Events Using Microblogs. Seventh BCS-IRSG Symposium on Future Directions in Information Access (FDIA 2017), Barcelona, Spain, 5 September 2017.
- Toraman, C. Event-related microblog retrieval in Turkish. Turkish Journal of Electrical Engineering and Computer Sciences. 2021. DOI: 10.3906/elk-2108-167
voidful/NMSQA
---
annotations_creators:
- crowdsourced
- machine-generated
language_creators:
- expert-generated
- machine-generated
- crowdsourced
language:
- en
license: []
multilinguality:
- monolingual
size_categories:
- unknown
source_datasets:
- original
task_categories:
- question-answering
- automatic-speech-recognition
task_ids:
- abstractive-qa
pretty_name: NMSQA
tags:
- speech-recognition
---
# Dataset Card for NMSQA (Natural Multi-speaker Spoken Question Answering)
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- Homepage:
https://github.com/DanielLin94144/DUAL-textless-SQA
- Repository:
https://github.com/DanielLin94144/DUAL-textless-SQA
- Paper:
https://arxiv.org/abs/2203.04911
- Leaderboard:
- Point of Contact:
Download audio data: [https://huggingface.co/datasets/voidful/NMSQA/resolve/main/nmsqa_audio.tar.gz](https://huggingface.co/datasets/voidful/NMSQA/resolve/main/nmsqa_audio.tar.gz)
Unzip audio data: `tar -xf nmsqa_audio.tar.gz`
### Dataset Summary
The Natural Multi-speaker Spoken Question Answering (NMSQA) dataset is designed for the task of textless spoken question answering. It is based on the SQuAD dataset and contains spoken questions and passages. The dataset includes the original text, transcriptions, and audio files of the spoken content. This dataset is created to evaluate the performance of models on textless spoken question answering tasks.
### Supported Tasks and Leaderboards
The primary task supported by this dataset is textless spoken question answering, where the goal is to answer questions based on spoken passages without relying on textual information. The dataset can also be used for automatic speech recognition tasks.
### Languages
The dataset is in English.
## Dataset Structure
### Data Instances
Each instance in the dataset contains the following fields:
- id: Unique identifier for the instance
- title: The title of the passage
- context: The passage text
- question: The question text
- answers: The answer annotations, containing:
  - answer_start: The start index of the answer in the text
  - audio_full_answer_end: The end position of the audio answer in seconds
  - audio_full_answer_start: The start position of the audio answer in seconds
  - audio_full_neg_answer_end: The end position in seconds of an incorrect answer with the same words
  - audio_full_neg_answer_start: The start position in seconds of an incorrect answer with the same words
  - audio_segment_answer_end: The end position of the audio answer in seconds, in the segment audio
  - audio_segment_answer_start: The start position of the audio answer in seconds, in the segment audio
  - text: The answer text
- content_segment_audio_path: The audio path for the content segment
- content_full_audio_path: The audio path for the full content
- content_audio_sampling_rate: The sampling rate of the content audio
- content_audio_speaker: The speaker of the content audio
- content_segment_text: The text of the content segment
- content_segment_normalized_text: The normalized text used to generate the audio
- question_audio_path: The audio path for the question
- question_audio_sampling_rate: The sampling rate of the question audio
- question_audio_speaker: The speaker of the question audio
- question_normalized_text: The normalized text used to generate the audio
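The audio answer positions are given in seconds, while the waveform itself is indexed by sample, so slicing an answer out of the audio requires multiplying the timestamps by the instance's sampling rate. A minimal sketch of that conversion (the numeric values below are made up for illustration; only the field semantics follow the list above):

```python
def answer_sample_span(start_s: float, end_s: float, sampling_rate: int) -> tuple[int, int]:
    """Convert an answer span given in seconds to waveform sample indices."""
    return int(round(start_s * sampling_rate)), int(round(end_s * sampling_rate))

# Hypothetical values shaped like the fields above (not taken from the real data)
sampling_rate = 22050          # content_audio_sampling_rate
answer_start_s, answer_end_s = 1.5, 2.0  # audio_full_answer_start / _end

start_idx, end_idx = answer_sample_span(answer_start_s, answer_end_s, sampling_rate)
print(start_idx, end_idx)  # 33075 44100
```

The resulting indices can then be used to slice the decoded waveform array for the answer segment.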
### Data Fields
The dataset includes the following data fields:
- id
- title
- context
- question
- answers
- content_segment_audio_path
- content_full_audio_path
- content_audio_sampling_rate
- content_audio_speaker
- content_segment_text
- content_segment_normalized_text
- question_audio_path
- question_audio_sampling_rate
- question_audio_speaker
- question_normalized_text
### Data Splits
The dataset is split into train, dev, and test sets.
## Dataset Creation
### Curation Rationale
The NMSQA dataset is created to address the challenge of textless spoken question answering, where the model must answer questions based on spoken passages without relying on textual information.
### Source Data
The NMSQA dataset is based on the SQuAD dataset, with spoken questions and passages created from the original text data.
#### Initial Data Collection and Normalization
The initial data collection involved converting the original SQuAD dataset's text-based questions and passages into spoken audio files. The text was first normalized, and then audio files were generated using text-to-speech methods.
#### Who are the source language producers?
The source language producers are the creators of the SQuAD dataset and the researchers who generated the spoken audio files for the NMSQA dataset.
### Annotations
#### Annotation process
The annotations for the NMSQA dataset are derived from the original SQuAD dataset. Additional annotations, such as audio start and end positions for correct and incorrect answers, as well as audio file paths and speaker information, are added by the dataset creators.
#### Who are the annotators?
The annotators for the NMSQA dataset are the creators of the SQuAD dataset and the researchers who generated the spoken audio files and additional annotations for the NMSQA dataset.
### Personal and Sensitive Information
The dataset does not contain any personal or sensitive information.
## Considerations for Using the Data
### Social Impact of Dataset
The NMSQA dataset contributes to the development and evaluation of models for textless spoken question answering tasks, which can lead to advancements in natural language processing and automatic speech recognition. Applications of these technologies can improve accessibility and convenience in various domains, such as virtual assistants, customer service, and voice-controlled devices.
### Discussion of Biases
The dataset inherits potential biases from the original SQuAD dataset, which may include biases in the selection of passages, questions, and answers. Additionally, biases may be introduced in the text-to-speech process and the choice of speakers used to generate the spoken audio files.
### Other Known Limitations
As the dataset is based on the SQuAD dataset, it shares the same limitations, including the fact that it is limited to the English language and mainly focuses on factual questions. Furthermore, the dataset may not cover a wide range of accents, dialects, or speaking styles.
## Additional Information
### Dataset Curators
The NMSQA dataset is curated by Guan-Ting Lin, Yung-Sung Chuang, Ho-Lam Chung, Shu-Wen Yang, Hsuan-Jui Chen, Shang-Wen Li, Abdelrahman Mohamed, Hung-Yi Lee, and Lin-Shan Lee.
### Licensing Information
The licensing information for the dataset is not explicitly mentioned.
### Citation Information
```bibtex
@article{lin2022dual,
title={DUAL: Textless Spoken Question Answering with Speech Discrete Unit Adaptive Learning},
author={Lin, Guan-Ting and Chuang, Yung-Sung and Chung, Ho-Lam and Yang, Shu-wen and Chen, Hsuan-Jui and Li, Shang-Wen and Mohamed, Abdelrahman and Lee, Hung-yi and Lee, Lin-shan},
journal={arXiv preprint arXiv:2203.04911},
year={2022}
}
```
### Contributions
Thanks to [@voidful](https://github.com/voidful) for adding this dataset. |
communistrigger/Patrick_Star | ---
license: mit
---
|
Back-up/QA-En-General | ---
dataset_info:
features:
- name: id
dtype: string
- name: system_prompt
dtype: string
- name: question
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 34111625.1145805
num_examples: 20000
download_size: 19538064
dataset_size: 34111625.1145805
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "QA-En-General"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Falah/helicopter_drawing_descriptions | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 176717
num_examples: 1000
download_size: 18746
dataset_size: 176717
---
# Dataset Card for "helicopter_drawing_descriptions"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_ehartford__Samantha-1.1-70b | ---
pretty_name: Evaluation run of ehartford/Samantha-1.1-70b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ehartford/Samantha-1.1-70b](https://huggingface.co/ehartford/Samantha-1.1-70b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 3 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ehartford__Samantha-1.1-70b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-19T15:47:52.190208](https://huggingface.co/datasets/open-llm-leaderboard/details_ehartford__Samantha-1.1-70b/blob/main/results_2023-10-19T15-47-52.190208.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.5110109060402684,\n\
\ \"em_stderr\": 0.00511922622901773,\n \"f1\": 0.5558829697986593,\n\
\ \"f1_stderr\": 0.004885906429154466,\n \"acc\": 0.5744121045517131,\n\
\ \"acc_stderr\": 0.011649119740139782\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.5110109060402684,\n \"em_stderr\": 0.00511922622901773,\n\
\ \"f1\": 0.5558829697986593,\n \"f1_stderr\": 0.004885906429154466\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3161485974222896,\n \
\ \"acc_stderr\": 0.012807630673451488\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8326756116811366,\n \"acc_stderr\": 0.010490608806828079\n\
\ }\n}\n```"
repo_url: https://huggingface.co/ehartford/Samantha-1.1-70b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_drop_3
data_files:
- split: 2023_10_19T15_47_52.190208
path:
- '**/details_harness|drop|3_2023-10-19T15-47-52.190208.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-19T15-47-52.190208.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_19T15_47_52.190208
path:
- '**/details_harness|gsm8k|5_2023-10-19T15-47-52.190208.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-19T15-47-52.190208.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_19T15_47_52.190208
path:
- '**/details_harness|winogrande|5_2023-10-19T15-47-52.190208.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-19T15-47-52.190208.parquet'
- config_name: results
data_files:
- split: 2023_10_19T15_47_52.190208
path:
- results_2023-10-19T15-47-52.190208.parquet
- split: latest
path:
- results_2023-10-19T15-47-52.190208.parquet
---
# Dataset Card for Evaluation run of ehartford/Samantha-1.1-70b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/ehartford/Samantha-1.1-70b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [ehartford/Samantha-1.1-70b](https://huggingface.co/ehartford/Samantha-1.1-70b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ehartford__Samantha-1.1-70b",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-19T15:47:52.190208](https://huggingface.co/datasets/open-llm-leaderboard/details_ehartford__Samantha-1.1-70b/blob/main/results_2023-10-19T15-47-52.190208.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.5110109060402684,
"em_stderr": 0.00511922622901773,
"f1": 0.5558829697986593,
"f1_stderr": 0.004885906429154466,
"acc": 0.5744121045517131,
"acc_stderr": 0.011649119740139782
},
"harness|drop|3": {
"em": 0.5110109060402684,
"em_stderr": 0.00511922622901773,
"f1": 0.5558829697986593,
"f1_stderr": 0.004885906429154466
},
"harness|gsm8k|5": {
"acc": 0.3161485974222896,
"acc_stderr": 0.012807630673451488
},
"harness|winogrande|5": {
"acc": 0.8326756116811366,
"acc_stderr": 0.010490608806828079
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |