---
pretty_name: Evaluation run of TeeZee/DarkSapling-7B-v1.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TeeZee/DarkSapling-7B-v1.1](https://huggingface.co/TeeZee/DarkSapling-7B-v1.1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can, for instance, do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TeeZee__DarkSapling-7B-v1.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-10T16:05:24.106495](https://huggingface.co/datasets/open-llm-leaderboard/details_TeeZee__DarkSapling-7B-v1.1/blob/main/results_2024-02-10T16-05-24.106495.json)\
\ (note that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6433485007331476,\n\
\ \"acc_stderr\": 0.03224755088237272,\n \"acc_norm\": 0.6480356098242434,\n\
\ \"acc_norm_stderr\": 0.03288865628071413,\n \"mc1\": 0.3635250917992656,\n\
\ \"mc1_stderr\": 0.016838862883965827,\n \"mc2\": 0.5203512584081402,\n\
\ \"mc2_stderr\": 0.015242875318998528\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6006825938566553,\n \"acc_stderr\": 0.014312094557946707,\n\
\ \"acc_norm\": 0.6348122866894198,\n \"acc_norm_stderr\": 0.0140702655192688\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6580362477594105,\n\
\ \"acc_stderr\": 0.004733980470799212,\n \"acc_norm\": 0.8509261103365864,\n\
\ \"acc_norm_stderr\": 0.003554333976897245\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n\
\ \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n\
\ \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n\
\ \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.028049186315695248,\n\
\ \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.028049186315695248\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7291666666666666,\n\
\ \"acc_stderr\": 0.03716177437566017,\n \"acc_norm\": 0.7291666666666666,\n\
\ \"acc_norm_stderr\": 0.03716177437566017\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6416184971098265,\n\
\ \"acc_stderr\": 0.036563436533531585,\n \"acc_norm\": 0.6416184971098265,\n\
\ \"acc_norm_stderr\": 0.036563436533531585\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n\
\ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5914893617021276,\n \"acc_stderr\": 0.032134180267015755,\n\
\ \"acc_norm\": 0.5914893617021276,\n \"acc_norm_stderr\": 0.032134180267015755\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n\
\ \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4021164021164021,\n \"acc_stderr\": 0.02525303255499769,\n \"\
acc_norm\": 0.4021164021164021,\n \"acc_norm_stderr\": 0.02525303255499769\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n\
\ \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n\
\ \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7709677419354839,\n\
\ \"acc_stderr\": 0.023904914311782648,\n \"acc_norm\": 0.7709677419354839,\n\
\ \"acc_norm_stderr\": 0.023904914311782648\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5320197044334976,\n \"acc_stderr\": 0.03510766597959215,\n\
\ \"acc_norm\": 0.5320197044334976,\n \"acc_norm_stderr\": 0.03510766597959215\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.797979797979798,\n \"acc_stderr\": 0.02860620428922987,\n \"acc_norm\"\
: 0.797979797979798,\n \"acc_norm_stderr\": 0.02860620428922987\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.02338193534812142,\n\
\ \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.02338193534812142\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6692307692307692,\n \"acc_stderr\": 0.023854795680971125,\n\
\ \"acc_norm\": 0.6692307692307692,\n \"acc_norm_stderr\": 0.023854795680971125\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.36666666666666664,\n \"acc_stderr\": 0.029381620726465076,\n \
\ \"acc_norm\": 0.36666666666666664,\n \"acc_norm_stderr\": 0.029381620726465076\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \
\ \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.304635761589404,\n \"acc_stderr\": 0.03757949922943343,\n \"acc_norm\"\
: 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943343\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8220183486238533,\n\
\ \"acc_stderr\": 0.016399436366612927,\n \"acc_norm\": 0.8220183486238533,\n\
\ \"acc_norm_stderr\": 0.016399436366612927\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977749,\n\
\ \"acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977749\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7941176470588235,\n \"acc_stderr\": 0.028379449451588674,\n \"\
acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.028379449451588674\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7805907172995781,\n \"acc_stderr\": 0.026939106581553945,\n \
\ \"acc_norm\": 0.7805907172995781,\n \"acc_norm_stderr\": 0.026939106581553945\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n\
\ \"acc_stderr\": 0.03138147637575499,\n \"acc_norm\": 0.6771300448430493,\n\
\ \"acc_norm_stderr\": 0.03138147637575499\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098825,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098825\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816508,\n \
\ \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816508\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8173690932311622,\n\
\ \"acc_stderr\": 0.013816335389973136,\n \"acc_norm\": 0.8173690932311622,\n\
\ \"acc_norm_stderr\": 0.013816335389973136\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7254335260115607,\n \"acc_stderr\": 0.024027745155265012,\n\
\ \"acc_norm\": 0.7254335260115607,\n \"acc_norm_stderr\": 0.024027745155265012\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.35083798882681566,\n\
\ \"acc_stderr\": 0.01596103667523096,\n \"acc_norm\": 0.35083798882681566,\n\
\ \"acc_norm_stderr\": 0.01596103667523096\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7483660130718954,\n \"acc_stderr\": 0.0248480182638752,\n\
\ \"acc_norm\": 0.7483660130718954,\n \"acc_norm_stderr\": 0.0248480182638752\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n\
\ \"acc_stderr\": 0.02592237178881877,\n \"acc_norm\": 0.7041800643086816,\n\
\ \"acc_norm_stderr\": 0.02592237178881877\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7283950617283951,\n \"acc_stderr\": 0.02474862449053737,\n\
\ \"acc_norm\": 0.7283950617283951,\n \"acc_norm_stderr\": 0.02474862449053737\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4716312056737589,\n \"acc_stderr\": 0.029779450957303055,\n \
\ \"acc_norm\": 0.4716312056737589,\n \"acc_norm_stderr\": 0.029779450957303055\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4511082138200782,\n\
\ \"acc_stderr\": 0.012709037347346233,\n \"acc_norm\": 0.4511082138200782,\n\
\ \"acc_norm_stderr\": 0.012709037347346233\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6727941176470589,\n \"acc_stderr\": 0.028501452860396556,\n\
\ \"acc_norm\": 0.6727941176470589,\n \"acc_norm_stderr\": 0.028501452860396556\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6830065359477124,\n \"acc_stderr\": 0.018824219512706214,\n \
\ \"acc_norm\": 0.6830065359477124,\n \"acc_norm_stderr\": 0.018824219512706214\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.02797982353874455,\n\
\ \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.02797982353874455\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n\
\ \"acc_stderr\": 0.026508590656233268,\n \"acc_norm\": 0.8308457711442786,\n\
\ \"acc_norm_stderr\": 0.026508590656233268\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \
\ \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\
\ \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n\
\ \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3635250917992656,\n\
\ \"mc1_stderr\": 0.016838862883965827,\n \"mc2\": 0.5203512584081402,\n\
\ \"mc2_stderr\": 0.015242875318998528\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7853196527229677,\n \"acc_stderr\": 0.011539912734345398\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4518574677786202,\n \
\ \"acc_stderr\": 0.01370849499567764\n }\n}\n```"
repo_url: https://huggingface.co/TeeZee/DarkSapling-7B-v1.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_10T16_05_24.106495
path:
- '**/details_harness|arc:challenge|25_2024-02-10T16-05-24.106495.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-10T16-05-24.106495.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_10T16_05_24.106495
path:
- '**/details_harness|gsm8k|5_2024-02-10T16-05-24.106495.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-10T16-05-24.106495.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_10T16_05_24.106495
path:
- '**/details_harness|hellaswag|10_2024-02-10T16-05-24.106495.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-10T16-05-24.106495.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_10T16_05_24.106495
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T16-05-24.106495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T16-05-24.106495.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T16-05-24.106495.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_10T16_05_24.106495
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T16-05-24.106495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T16-05-24.106495.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_10T16_05_24.106495
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T16-05-24.106495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T16-05-24.106495.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_10T16_05_24.106495
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T16-05-24.106495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T16-05-24.106495.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_10T16_05_24.106495
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T16-05-24.106495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T16-05-24.106495.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_10T16_05_24.106495
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T16-05-24.106495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T16-05-24.106495.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_10T16_05_24.106495
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T16-05-24.106495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T16-05-24.106495.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_10T16_05_24.106495
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T16-05-24.106495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T16-05-24.106495.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_10T16_05_24.106495
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T16-05-24.106495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T16-05-24.106495.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_10T16_05_24.106495
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T16-05-24.106495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T16-05-24.106495.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_10T16_05_24.106495
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T16-05-24.106495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T16-05-24.106495.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_10T16_05_24.106495
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T16-05-24.106495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T16-05-24.106495.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_10T16_05_24.106495
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T16-05-24.106495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T16-05-24.106495.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_10T16_05_24.106495
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T16-05-24.106495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T16-05-24.106495.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_10T16_05_24.106495
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T16-05-24.106495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T16-05-24.106495.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_10T16_05_24.106495
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T16-05-24.106495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T16-05-24.106495.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_10T16_05_24.106495
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T16-05-24.106495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T16-05-24.106495.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_10T16_05_24.106495
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T16-05-24.106495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T16-05-24.106495.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_10T16_05_24.106495
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T16-05-24.106495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T16-05-24.106495.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_10T16_05_24.106495
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T16-05-24.106495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T16-05-24.106495.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_10T16_05_24.106495
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T16-05-24.106495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T16-05-24.106495.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_10T16_05_24.106495
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T16-05-24.106495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T16-05-24.106495.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_10T16_05_24.106495
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T16-05-24.106495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T16-05-24.106495.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_10T16_05_24.106495
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T16-05-24.106495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T16-05-24.106495.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_10T16_05_24.106495
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T16-05-24.106495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T16-05-24.106495.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_10T16_05_24.106495
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T16-05-24.106495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T16-05-24.106495.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_10T16_05_24.106495
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T16-05-24.106495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T16-05-24.106495.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_10T16_05_24.106495
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T16-05-24.106495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T16-05-24.106495.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_10T16_05_24.106495
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T16-05-24.106495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T16-05-24.106495.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_10T16_05_24.106495
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T16-05-24.106495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T16-05-24.106495.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_10T16_05_24.106495
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T16-05-24.106495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T16-05-24.106495.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_10T16_05_24.106495
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T16-05-24.106495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T16-05-24.106495.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_10T16_05_24.106495
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T16-05-24.106495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T16-05-24.106495.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_10T16_05_24.106495
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T16-05-24.106495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T16-05-24.106495.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_10T16_05_24.106495
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T16-05-24.106495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T16-05-24.106495.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_10T16_05_24.106495
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T16-05-24.106495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T16-05-24.106495.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_10T16_05_24.106495
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T16-05-24.106495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T16-05-24.106495.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_10T16_05_24.106495
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T16-05-24.106495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T16-05-24.106495.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_10T16_05_24.106495
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T16-05-24.106495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T16-05-24.106495.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_10T16_05_24.106495
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-10T16-05-24.106495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-10T16-05-24.106495.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_10T16_05_24.106495
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T16-05-24.106495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T16-05-24.106495.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_10T16_05_24.106495
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T16-05-24.106495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T16-05-24.106495.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_10T16_05_24.106495
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T16-05-24.106495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T16-05-24.106495.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_10T16_05_24.106495
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T16-05-24.106495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T16-05-24.106495.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_10T16_05_24.106495
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T16-05-24.106495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T16-05-24.106495.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_10T16_05_24.106495
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T16-05-24.106495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T16-05-24.106495.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_10T16_05_24.106495
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T16-05-24.106495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T16-05-24.106495.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_10T16_05_24.106495
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T16-05-24.106495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T16-05-24.106495.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_10T16_05_24.106495
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T16-05-24.106495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T16-05-24.106495.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_10T16_05_24.106495
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T16-05-24.106495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T16-05-24.106495.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_10T16_05_24.106495
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T16-05-24.106495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T16-05-24.106495.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_10T16_05_24.106495
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T16-05-24.106495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T16-05-24.106495.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_10T16_05_24.106495
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T16-05-24.106495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T16-05-24.106495.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_10T16_05_24.106495
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T16-05-24.106495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T16-05-24.106495.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_10T16_05_24.106495
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T16-05-24.106495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T16-05-24.106495.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_10T16_05_24.106495
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T16-05-24.106495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T16-05-24.106495.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_10T16_05_24.106495
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T16-05-24.106495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T16-05-24.106495.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_10T16_05_24.106495
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T16-05-24.106495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T16-05-24.106495.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_10T16_05_24.106495
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-10T16-05-24.106495.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-10T16-05-24.106495.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_10T16_05_24.106495
path:
- '**/details_harness|winogrande|5_2024-02-10T16-05-24.106495.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-10T16-05-24.106495.parquet'
- config_name: results
data_files:
- split: 2024_02_10T16_05_24.106495
path:
- results_2024-02-10T16-05-24.106495.parquet
- split: latest
path:
- results_2024-02-10T16-05-24.106495.parquet
---
# Dataset Card for Evaluation run of TeeZee/DarkSapling-7B-v1.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [TeeZee/DarkSapling-7B-v1.1](https://huggingface.co/TeeZee/DarkSapling-7B-v1.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TeeZee__DarkSapling-7B-v1.1",
"harness_winogrande_5",
split="train")
```
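The timestamped split names above are derived from the run timestamp by replacing the `-` and `:` separators with underscores. A minimal sketch of that mapping (inferred from the split names in this card; the convention is an assumption, not an official API):

```python
# Sketch: derive the split name used in this dataset from a run timestamp.
# Assumption (inferred from the split names in the config list above):
# '-' and ':' become '_', while the 'T' separator and the fractional
# seconds are kept unchanged.
def run_split_name(timestamp: str) -> str:
    return timestamp.replace("-", "_").replace(":", "_")

print(run_split_name("2024-02-10T16:05:24.106495"))
# -> 2024_02_10T16_05_24.106495
```

With the split name in hand, it can be passed as the `split` argument to `load_dataset` in place of `"train"` to pin a specific run.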
## Latest results
These are the [latest results from run 2024-02-10T16:05:24.106495](https://huggingface.co/datasets/open-llm-leaderboard/details_TeeZee__DarkSapling-7B-v1.1/blob/main/results_2024-02-10T16-05-24.106495.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6433485007331476,
"acc_stderr": 0.03224755088237272,
"acc_norm": 0.6480356098242434,
"acc_norm_stderr": 0.03288865628071413,
"mc1": 0.3635250917992656,
"mc1_stderr": 0.016838862883965827,
"mc2": 0.5203512584081402,
"mc2_stderr": 0.015242875318998528
},
"harness|arc:challenge|25": {
"acc": 0.6006825938566553,
"acc_stderr": 0.014312094557946707,
"acc_norm": 0.6348122866894198,
"acc_norm_stderr": 0.0140702655192688
},
"harness|hellaswag|10": {
"acc": 0.6580362477594105,
"acc_stderr": 0.004733980470799212,
"acc_norm": 0.8509261103365864,
"acc_norm_stderr": 0.003554333976897245
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.028049186315695248,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.028049186315695248
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7291666666666666,
"acc_stderr": 0.03716177437566017,
"acc_norm": 0.7291666666666666,
"acc_norm_stderr": 0.03716177437566017
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.036563436533531585,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.036563436533531585
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5914893617021276,
"acc_stderr": 0.032134180267015755,
"acc_norm": 0.5914893617021276,
"acc_norm_stderr": 0.032134180267015755
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5862068965517241,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.5862068965517241,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4021164021164021,
"acc_stderr": 0.02525303255499769,
"acc_norm": 0.4021164021164021,
"acc_norm_stderr": 0.02525303255499769
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7709677419354839,
"acc_stderr": 0.023904914311782648,
"acc_norm": 0.7709677419354839,
"acc_norm_stderr": 0.023904914311782648
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5320197044334976,
"acc_stderr": 0.03510766597959215,
"acc_norm": 0.5320197044334976,
"acc_norm_stderr": 0.03510766597959215
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.02860620428922987,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.02860620428922987
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.02338193534812142,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.02338193534812142
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6692307692307692,
"acc_stderr": 0.023854795680971125,
"acc_norm": 0.6692307692307692,
"acc_norm_stderr": 0.023854795680971125
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.36666666666666664,
"acc_stderr": 0.029381620726465076,
"acc_norm": 0.36666666666666664,
"acc_norm_stderr": 0.029381620726465076
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.03058869701378364,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.03058869701378364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943343,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943343
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8220183486238533,
"acc_stderr": 0.016399436366612927,
"acc_norm": 0.8220183486238533,
"acc_norm_stderr": 0.016399436366612927
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.03408655867977749,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.03408655867977749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.028379449451588674,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.028379449451588674
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7805907172995781,
"acc_stderr": 0.026939106581553945,
"acc_norm": 0.7805907172995781,
"acc_norm_stderr": 0.026939106581553945
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.03138147637575499,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.03138147637575499
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098825,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098825
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5,
"acc_stderr": 0.04745789978762494,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04745789978762494
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507333,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816508,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816508
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8173690932311622,
"acc_stderr": 0.013816335389973136,
"acc_norm": 0.8173690932311622,
"acc_norm_stderr": 0.013816335389973136
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7254335260115607,
"acc_stderr": 0.024027745155265012,
"acc_norm": 0.7254335260115607,
"acc_norm_stderr": 0.024027745155265012
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.35083798882681566,
"acc_stderr": 0.01596103667523096,
"acc_norm": 0.35083798882681566,
"acc_norm_stderr": 0.01596103667523096
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7483660130718954,
"acc_stderr": 0.0248480182638752,
"acc_norm": 0.7483660130718954,
"acc_norm_stderr": 0.0248480182638752
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.02592237178881877,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.02592237178881877
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7283950617283951,
"acc_stderr": 0.02474862449053737,
"acc_norm": 0.7283950617283951,
"acc_norm_stderr": 0.02474862449053737
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4716312056737589,
"acc_stderr": 0.029779450957303055,
"acc_norm": 0.4716312056737589,
"acc_norm_stderr": 0.029779450957303055
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4511082138200782,
"acc_stderr": 0.012709037347346233,
"acc_norm": 0.4511082138200782,
"acc_norm_stderr": 0.012709037347346233
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6727941176470589,
"acc_stderr": 0.028501452860396556,
"acc_norm": 0.6727941176470589,
"acc_norm_stderr": 0.028501452860396556
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6830065359477124,
"acc_stderr": 0.018824219512706214,
"acc_norm": 0.6830065359477124,
"acc_norm_stderr": 0.018824219512706214
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7428571428571429,
"acc_stderr": 0.02797982353874455,
"acc_norm": 0.7428571428571429,
"acc_norm_stderr": 0.02797982353874455
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.026508590656233268,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.026508590656233268
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3635250917992656,
"mc1_stderr": 0.016838862883965827,
"mc2": 0.5203512584081402,
"mc2_stderr": 0.015242875318998528
},
"harness|winogrande|5": {
"acc": 0.7853196527229677,
"acc_stderr": 0.011539912734345398
},
"harness|gsm8k|5": {
"acc": 0.4518574677786202,
"acc_stderr": 0.01370849499567764
}
}
```
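An aggregate accuracy like the one in the `"all"` block can be recomputed from the per-task entries. A minimal sketch, using a small hand-copied subset of the JSON above (the leaderboard's actual aggregation may differ, e.g. in which tasks are included; this is a plain unweighted mean over the copied subset):

```python
# Sketch: unweighted mean of per-task "acc" values, as a rough analogue of
# the aggregated "all" block above. The keys and values below are copied
# from this card's results JSON (a subset only, for illustration).
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.35},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6296296296296297},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.6776315789473685},
}

# Skip the aggregate entry itself if present, then average the rest.
accs = [v["acc"] for k, v in results.items() if k != "all"]
mean_acc = sum(accs) / len(accs)
print(round(mean_acc, 4))
# -> 0.5524
```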
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
ppxscal/arxiv-metadata-oai-snapshot-t_a-tokenized | ---
dataset_info:
features:
- name: id
dtype: string
- name: submitter
dtype: string
- name: authors
dtype: string
- name: title
dtype: string
- name: comments
dtype: string
- name: journal-ref
dtype: string
- name: doi
dtype: string
- name: report-no
dtype: string
- name: categories
dtype: string
- name: license
dtype: string
- name: abstract
dtype: string
- name: versions
list:
- name: created
dtype: string
- name: version
dtype: string
- name: update_date
dtype: string
- name: authors_parsed
sequence:
sequence: string
- name: title_tokens
sequence: int64
- name: abstract_tokens
sequence: int64
- name: title_attention_mask
sequence: int64
- name: abstract_attention_mask
sequence: int64
splits:
- name: train
num_bytes: 41515729836
num_examples: 2318918
download_size: 2981082766
dataset_size: 41515729836
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "arxiv-metadata-oai-snapshot-t_a-tokenized"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

Tokenized with Shitao/RetroMAE |
DopeorNope/new_instruct7 | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
- name: instruction
dtype: string
splits:
- name: train
num_bytes: 460598088
num_examples: 133946
download_size: 232259663
dataset_size: 460598088
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
freshpearYoon/train_free_44 | ---
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 9604766128
num_examples: 10000
download_size: 1468311141
dataset_size: 9604766128
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
qazisaad/llama_2_optimized_product_titles-esci-part1 | ---
dataset_info:
features:
- name: level_0
dtype: int64
- name: index
dtype: int64
- name: product_title
dtype: string
- name: average_score
dtype: float64
- name: total_score
dtype: float64
- name: text
dtype: string
- name: preds
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 5347864
num_examples: 1680
download_size: 1028985
dataset_size: 5347864
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "llama_2_optimized_product_titles-esci-part1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
fursov/gec_ner_val | ---
dataset_info:
features:
- name: tokens
sequence: string
- name: ner_tags
sequence: int64
splits:
- name: test
num_bytes: 21496736.708623063
num_examples: 55538
- name: validation
num_bytes: 1548254.2913769358
num_examples: 4000
download_size: 4069267
dataset_size: 23044991.0
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
|
varun-v-rao/mimic-cxr-dpo-with-metrics | ---
license: mit
dataset_info:
features:
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
- name: rougeL
dtype: float64
- name: F1RadGraph
dtype: float64
- name: F1CheXbert
dtype: float32
splits:
- name: train
num_bytes: 77989978
num_examples: 125417
download_size: 26646031
dataset_size: 77989978
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
niv-al/sq-babi_nli_size-reasoning | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: labels
dtype:
class_label:
names:
'0': not-entailed
'1': entailed
splits:
- name: train
num_bytes: 276956
num_examples: 1000
- name: validation
num_bytes: 38395
num_examples: 144
- name: test
num_bytes: 38898
num_examples: 144
download_size: 32189
dataset_size: 354249
language:
- sq
---
# Dataset Card for "sq-babi_nli_size-reasoning"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
shoeb1lly/RVC-Model-ErebusV2 | ---
license: cc-by-4.0
---
|
johannes-garstenauer/embeddings_from_distilbert_masking_heaps_and_eval_part1 | ---
dataset_info:
features:
- name: struct
dtype: string
- name: label
dtype: int64
- name: pred
dtype: int64
- name: cls_layer_6
sequence: float32
- name: cls_layer_5
sequence: float32
- name: cls_layer_4
sequence: float32
splits:
- name: train
num_bytes: 1281395185
num_examples: 134495
download_size: 1491732485
dataset_size: 1281395185
---
# Dataset Card for "embeddings_from_distilbert_masking_heaps_and_eval_part1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
deepapaikar/SC_katzbot | ---
license: apache-2.0
---
|
meowmeownig/meowdels | ---
license: creativeml-openrail-m
---
|
arubenruben/mini_harem_selective_ours | ---
dataset_info:
features:
- name: tokens
sequence: string
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': B-PESSOA
'2': I-PESSOA
'3': B-ORGANIZACAO
'4': I-ORGANIZACAO
'5': B-LOCAL
'6': I-LOCAL
'7': B-TEMPO
'8': I-TEMPO
'9': B-VALOR
'10': I-VALOR
splits:
- name: validation
num_bytes: 1031284
num_examples: 178
download_size: 220176
dataset_size: 1031284
---
# Dataset Card for "mini_harem_selective_ours"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
gsstein/results-llama-1 | ---
dataset_info:
features:
- name: id
dtype: string
- name: base_100_x
dtype: string
- name: opt_100
dtype: string
- name: generated_opt_100
dtype: bool
- name: opt_75
dtype: string
- name: generated_opt_75
dtype: bool
- name: opt_50
dtype: string
- name: generated_opt_50
dtype: bool
- name: opt_25
dtype: string
- name: generated_opt_25
dtype: bool
- name: opt_0
dtype: string
- name: generated_opt_0
dtype: bool
- name: llama_100
dtype: string
- name: llama_75
dtype: string
- name: base_100_y
dtype: string
- name: base_75
dtype: string
- name: base_50
dtype: string
- name: base_25
dtype: string
- name: base_0
dtype: string
splits:
- name: train
num_bytes: 23606178
num_examples: 15326
- name: test
num_bytes: 885267
num_examples: 576
- name: validation
num_bytes: 876520
num_examples: 576
download_size: 16194564
dataset_size: 25367965
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
|
philschmid/llama2-german-corpus-tokenized-llama-chunk-4096 | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: token_type_ids
sequence: int8
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 1190392538880
num_examples: 20753008
download_size: 307400657843
dataset_size: 1190392538880
---
# Dataset Card for "llama2-german-corpus-tokenized-llama-chunk-4096"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
adalib/starcoder-apis-1 | ---
dataset_info:
features:
- name: code
dtype: string
- name: apis
sequence: string
- name: extract_api
dtype: string
splits:
- name: train
num_bytes: 12974165655
num_examples: 1591637
download_size: 4523271352
dataset_size: 12974165655
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
sablo/HelpSteer_binarized | ---
language:
- en
license: cc-by-4.0
dataset_info:
features:
- name: prompt
dtype: string
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
- name: score_chosen
dtype: float64
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
- name: score_rejected
dtype: float64
splits:
- name: train
num_bytes: 69199364
num_examples: 8130
- name: test
num_bytes: 3597313
num_examples: 418
download_size: 42251007
dataset_size: 72796677
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
tags:
- human-feedback
---
# Binarized version of HelpSteer
### Dataset Description
A binarized version of https://huggingface.co/datasets/nvidia/HelpSteer, ready for DPO with https://github.com/huggingface/alignment-handbook or similar tooling.
For each unique prompt, we take the best- and worst-scoring responses (scored by the average of helpfulness and correctness) and convert them into MessagesList format in the 'chosen' and 'rejected' columns.
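The binarization described above can be sketched as follows. This is an assumed reimplementation, not the authors' exact script; the input field names (`prompt`, `response`, `helpfulness`, `correctness`) are assumptions about the raw HelpSteer schema.

```python
# Hedged sketch: for each unique prompt, keep the highest- and lowest-scoring
# responses by the average of helpfulness and correctness, emitting the
# chosen/rejected MessagesList columns of this dataset.
from collections import defaultdict

def binarize(rows):
    """rows: dicts with 'prompt', 'response', 'helpfulness', 'correctness'."""
    by_prompt = defaultdict(list)
    for r in rows:
        score = (r["helpfulness"] + r["correctness"]) / 2
        by_prompt[r["prompt"]].append((score, r["response"]))
    out = []
    for prompt, scored in by_prompt.items():
        scored.sort(key=lambda s: s[0])
        worst, best = scored[0], scored[-1]
        out.append({
            "prompt": prompt,
            # MessagesList format: a list of {role, content} turns
            "chosen": [{"role": "user", "content": prompt},
                       {"role": "assistant", "content": best[1]}],
            "score_chosen": best[0],
            "rejected": [{"role": "user", "content": prompt},
                         {"role": "assistant", "content": worst[1]}],
            "score_rejected": worst[0],
        })
    return out
```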
- **Created by:** [dctanner](https://huggingface.co/dctanner) and the team at [Sablo AI](https://sablo.ai)
- **License:** CC BY 4.0 |
CyberHarem/toyokawa_fuuka_theidolmstermillionlive | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of toyokawa_fuuka/豊川風花/토요카와후카 (THE iDOLM@STER: Million Live!)
This is the dataset of toyokawa_fuuka/豊川風花/토요카와후카 (THE iDOLM@STER: Million Live!), containing 500 images and their tags.
The core tags of this character are `blue_hair, short_hair, breasts, brown_eyes, antenna_hair, large_breasts, bangs, wavy_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 559.61 MiB | [Download](https://huggingface.co/datasets/CyberHarem/toyokawa_fuuka_theidolmstermillionlive/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 343.28 MiB | [Download](https://huggingface.co/datasets/CyberHarem/toyokawa_fuuka_theidolmstermillionlive/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1173 | 728.41 MiB | [Download](https://huggingface.co/datasets/CyberHarem/toyokawa_fuuka_theidolmstermillionlive/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 504.77 MiB | [Download](https://huggingface.co/datasets/CyberHarem/toyokawa_fuuka_theidolmstermillionlive/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1173 | 999.00 MiB | [Download](https://huggingface.co/datasets/CyberHarem/toyokawa_fuuka_theidolmstermillionlive/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/toyokawa_fuuka_theidolmstermillionlive',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 22 |  |  |  |  |  | 1girl, solo, blush, looking_at_viewer, bikini, cleavage, navel, open_mouth, simple_background, white_background, smile |
| 1 | 8 |  |  |  |  |  | 1girl, day, looking_at_viewer, navel, ocean, outdoors, smile, solo, blue_sky, cloud, cleavage, cowboy_shot, open_mouth, beach, collarbone, blue_bikini, blush, covered_nipples, halterneck |
| 2 | 5 |  |  |  |  |  | 1girl, :d, looking_at_viewer, open_mouth, solo, dress, polka_dot, character_name, character_signature, hat |
| 3 | 12 |  |  |  |  |  | 1girl, looking_at_viewer, nipples, solo, blush, navel, open_mouth, smile, female_pubic_hair, collarbone, completely_nude |
| 4 | 7 |  |  |  |  |  | 1girl, female_pubic_hair, nipples, navel, pussy, spread_legs, sweat, looking_at_viewer, 1boy, anus, blush, hetero, mosaic_censoring, on_bed, completely_nude, on_back, open_mouth, pillow, sex, solo_focus, vaginal |
| 5 | 9 |  |  |  |  |  | 1boy, 1girl, blush, hetero, penis, solo_focus, nipples, paizuri, sweat, breasts_squeezed_together, mosaic_censoring, open_mouth, collarbone, completely_nude, cum_in_mouth, fellatio |
| 6 | 8 |  |  |  |  |  | smile, 1girl, blush, hair_flower, pearl_necklace, solo, earrings, looking_at_viewer, medium_breasts, purple_dress, bare_shoulders, collarbone, strapless_dress, black_gloves, character_name, hair_between_eyes, open_mouth, pink_flower, rose, sparkle, upper_body |
| 7 | 9 |  |  |  |  |  | 1girl, long_sleeves, solo, white_shirt, blush, pleated_skirt, ponytail, serafuku, white_background, looking_at_viewer, simple_background, yellow_neckerchief, blue_skirt, white_sailor_collar, collarbone, smile |
| 8 | 10 |  |  |  |  |  | 1girl, playboy_bunny, rabbit_ears, cleavage, detached_collar, fake_animal_ears, solo, bare_shoulders, looking_at_viewer, strapless_leotard, wrist_cuffs, blush, rabbit_tail, black_pantyhose, bowtie, cowboy_shot, black_leotard, fake_tail, fishnet_pantyhose, simple_background, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | blush | looking_at_viewer | bikini | cleavage | navel | open_mouth | simple_background | white_background | smile | day | ocean | outdoors | blue_sky | cloud | cowboy_shot | beach | collarbone | blue_bikini | covered_nipples | halterneck | :d | dress | polka_dot | character_name | character_signature | hat | nipples | female_pubic_hair | completely_nude | pussy | spread_legs | sweat | 1boy | anus | hetero | mosaic_censoring | on_bed | on_back | pillow | sex | solo_focus | vaginal | penis | paizuri | breasts_squeezed_together | cum_in_mouth | fellatio | hair_flower | pearl_necklace | earrings | medium_breasts | purple_dress | bare_shoulders | strapless_dress | black_gloves | hair_between_eyes | pink_flower | rose | sparkle | upper_body | long_sleeves | white_shirt | pleated_skirt | ponytail | serafuku | yellow_neckerchief | blue_skirt | white_sailor_collar | playboy_bunny | rabbit_ears | detached_collar | fake_animal_ears | strapless_leotard | wrist_cuffs | rabbit_tail | black_pantyhose | bowtie | black_leotard | fake_tail | fishnet_pantyhose |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------|:--------------------|:---------|:-----------|:--------|:-------------|:--------------------|:-------------------|:--------|:------|:--------|:-----------|:-----------|:--------|:--------------|:--------|:-------------|:--------------|:------------------|:-------------|:-----|:--------|:------------|:-----------------|:----------------------|:------|:----------|:--------------------|:------------------|:--------|:--------------|:--------|:-------|:-------|:---------|:-------------------|:---------|:----------|:---------|:------|:-------------|:----------|:--------|:----------|:----------------------------|:---------------|:-----------|:--------------|:-----------------|:-----------|:-----------------|:---------------|:-----------------|:------------------|:---------------|:--------------------|:--------------|:-------|:----------|:-------------|:---------------|:--------------|:----------------|:-----------|:-----------|:---------------------|:-------------|:----------------------|:----------------|:--------------|:------------------|:-------------------|:--------------------|:--------------|:--------------|:------------------|:---------|:----------------|:------------|:--------------------|
| 0 | 22 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 8 |  |  |  |  |  | X | X | X | X | | X | X | X | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | X | | X | | | | X | | | | | | | | | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 12 |  |  |  |  |  | X | X | X | X | | | X | X | | | X | | | | | | | | X | | | | | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 7 |  |  |  |  |  | X | | X | X | | | X | X | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 9 |  |  |  |  |  | X | | X | | | | | X | | | | | | | | | | | X | | | | | | | | | | X | | X | | | X | X | | X | X | | | | | X | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 8 |  |  |  |  |  | X | X | X | X | | | | X | | | X | | | | | | | | X | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | |
| 7 | 9 |  |  |  |  |  | X | X | X | X | | | | | X | X | X | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | |
| 8 | 10 |  |  |  |  |  | X | X | X | X | | X | | | X | X | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X |
|
maxolotl/must-c-en-fr-wait03_22.21 | ---
dataset_info:
features:
- name: current_source
dtype: string
- name: current_target
dtype: string
- name: target_token
dtype: string
splits:
- name: train
num_bytes: 1071696759
num_examples: 5530635
- name: test
num_bytes: 11897959
num_examples: 64317
- name: validation
num_bytes: 5584999
num_examples: 29172
download_size: 189892905
dataset_size: 1089179717
---
# Dataset Card for "must-c-en-fr-wait03_22.21"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_wei123602__Llama-2-13b-FINETUNE4 | ---
pretty_name: Evaluation run of wei123602/Llama-2-13b-FINETUNE4
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [wei123602/Llama-2-13b-FINETUNE4](https://huggingface.co/wei123602/Llama-2-13b-FINETUNE4)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_wei123602__Llama-2-13b-FINETUNE4\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-23T06:23:21.987505](https://huggingface.co/datasets/open-llm-leaderboard/details_wei123602__Llama-2-13b-FINETUNE4/blob/main/results_2023-10-23T06-23-21.987505.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find them in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.08525587248322147,\n\
\ \"em_stderr\": 0.0028599050719363664,\n \"f1\": 0.13560297818791875,\n\
\ \"f1_stderr\": 0.0029877199841954003,\n \"acc\": 0.44731455091723,\n\
\ \"acc_stderr\": 0.010474236802343157\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.08525587248322147,\n \"em_stderr\": 0.0028599050719363664,\n\
\ \"f1\": 0.13560297818791875,\n \"f1_stderr\": 0.0029877199841954003\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.12509476876421532,\n \
\ \"acc_stderr\": 0.009112601439849643\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7695343330702447,\n \"acc_stderr\": 0.011835872164836671\n\
\ }\n}\n```"
repo_url: https://huggingface.co/wei123602/Llama-2-13b-FINETUNE4
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|arc:challenge|25_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_23T06_23_21.987505
path:
- '**/details_harness|drop|3_2023-10-23T06-23-21.987505.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-23T06-23-21.987505.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_23T06_23_21.987505
path:
- '**/details_harness|gsm8k|5_2023-10-23T06-23-21.987505.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-23T06-23-21.987505.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hellaswag|10_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T13-14-12.416583.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-18T13-14-12.416583.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-18T13-14-12.416583.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_23T06_23_21.987505
path:
- '**/details_harness|winogrande|5_2023-10-23T06-23-21.987505.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-23T06-23-21.987505.parquet'
- config_name: results
data_files:
- split: 2023_09_18T13_14_12.416583
path:
- results_2023-09-18T13-14-12.416583.parquet
- split: 2023_10_23T06_23_21.987505
path:
- results_2023-10-23T06-23-21.987505.parquet
- split: latest
path:
- results_2023-10-23T06-23-21.987505.parquet
---
# Dataset Card for Evaluation run of wei123602/Llama-2-13b-FINETUNE4
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/wei123602/Llama-2-13b-FINETUNE4
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [wei123602/Llama-2-13b-FINETUNE4](https://huggingface.co/wei123602/Llama-2-13b-FINETUNE4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_wei123602__Llama-2-13b-FINETUNE4",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-23T06:23:21.987505](https://huggingface.co/datasets/open-llm-leaderboard/details_wei123602__Llama-2-13b-FINETUNE4/blob/main/results_2023-10-23T06-23-21.987505.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.08525587248322147,
"em_stderr": 0.0028599050719363664,
"f1": 0.13560297818791875,
"f1_stderr": 0.0029877199841954003,
"acc": 0.44731455091723,
"acc_stderr": 0.010474236802343157
},
"harness|drop|3": {
"em": 0.08525587248322147,
"em_stderr": 0.0028599050719363664,
"f1": 0.13560297818791875,
"f1_stderr": 0.0029877199841954003
},
"harness|gsm8k|5": {
"acc": 0.12509476876421532,
"acc_stderr": 0.009112601439849643
},
"harness|winogrande|5": {
"acc": 0.7695343330702447,
"acc_stderr": 0.011835872164836671
}
}
```
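As a sanity check, the aggregated "acc" in the "all" block above is simply the unweighted mean of the per-task accuracies; a minimal sketch:

```python
# The "all" accuracy is the mean of the per-task accuracies reported
# above (harness|gsm8k|5 and harness|winogrande|5).
per_task_acc = {
    "harness|gsm8k|5": 0.12509476876421532,
    "harness|winogrande|5": 0.7695343330702447,
}
mean_acc = sum(per_task_acc.values()) / len(per_task_acc)
print(mean_acc)  # ~0.44731455091723, matching the "all" block
```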
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Balassar/balassarprofile | ---
dataset_info:
features:
- name: data_input
dtype: string
splits:
- name: train
num_bytes: 5337.6
num_examples: 16
- name: test
num_bytes: 1334.4
num_examples: 4
download_size: 9527
dataset_size: 6672.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
quan246/MultiMed_Doc | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: translation
struct:
- name: en
dtype: string
- name: vi
dtype: string
splits:
- name: train
num_bytes: 351140
num_examples: 1000
- name: dev
num_bytes: 31689
num_examples: 100
- name: test
num_bytes: 464211
num_examples: 4230
download_size: 484287
dataset_size: 847040
---
# Dataset Card for "MultiMed_Doc"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
EnergyStarAI/ASR | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 193898056.0
num_examples: 1000
download_size: 189589875
dataset_size: 193898056.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
NeuroBench/mackey_glass | ---
license: cc-by-4.0
---
Pre-generated numpy arrays of MackeyGlass time series, generated with the [jitcdde](https://jitcdde.readthedocs.io/en/stable/) library.
Please note that, due to the lower-level solvers used in the library, different machines, even with the same ISA and library versions, may produce different data.
Thus, please use the pre-generated data included here.
The dataset contains 14 time series, each using MG parameters beta=0.2, gamma=0.1, n=10; tau is varied per time series from 17 to 30.
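The tau sweep described above can be enumerated directly; a minimal sketch (the file naming is hypothetical, not taken from this dataset):

```python
# One series per tau value, tau swept from 17 to 30 inclusive,
# all with beta=0.2, gamma=0.1, n=10.
taus = list(range(17, 31))
series_files = [f"mg_tau{tau}.npy" for tau in taus]  # hypothetical names
print(len(series_files))  # 14, one per time series
```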
Each time series is 50 Lyapunov times in length, with 75 points per Lyapunov time. |
Rashedul12/Test123 | ---
license: openrail
---
TEST READ |
TaylorAI/FLAN-longer-400k | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: task_source
dtype: string
- name: task_name
dtype: string
- name: template_type
dtype: string
splits:
- name: train
num_bytes: 688106586.2153635
num_examples: 400000
download_size: 451942436
dataset_size: 688106586.2153635
---
# Dataset Card for "FLAN-longer-400k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-staging-eval-project-b756be98-8935185 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- emotion
eval_info:
task: multi_class_classification
model: uygarkurt/distilbert-base-uncased-finetuned-emotion
metrics: []
dataset_name: emotion
dataset_config: default
dataset_split: test
col_mapping:
text: text
target: label
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Multi-class Text Classification
* Model: uygarkurt/distilbert-base-uncased-finetuned-emotion
* Dataset: emotion
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
nglaura/koreascience-summarization | ---
license: apache-2.0
task_categories:
- summarization
language:
- ko
pretty_name: KoreaScience
---
# LoRaLay: A Multilingual and Multimodal Dataset for Long Range and Layout-Aware Summarization
A collaboration between [reciTAL](https://recital.ai/en/), [MLIA](https://mlia.lip6.fr/) (ISIR, Sorbonne Université), [Meta AI](https://ai.facebook.com/), and [Università di Trento](https://www.unitn.it/)
## KoreaScience dataset for summarization
KoreaScience is a dataset for summarization of research papers written in Korean, for which layout information is provided.
### Data Fields
- `article_id`: article id
- `article_words`: sequence of words constituting the body of the article
- `article_bboxes`: sequence of corresponding word bounding boxes
- `norm_article_bboxes`: sequence of corresponding normalized word bounding boxes
- `abstract`: a string containing the abstract of the article
- `article_pdf_url`: URL of the article's PDF
### Data Splits
This dataset has 3 splits: _train_, _validation_, and _test_.
| Dataset Split | Number of Instances |
| ------------- | --------------------|
| Train | 35,248 |
| Validation | 1,125 |
| Test | 1,125 |
## Citation
``` latex
@article{nguyen2023loralay,
title={LoRaLay: A Multilingual and Multimodal Dataset for Long Range and Layout-Aware Summarization},
author={Nguyen, Laura and Scialom, Thomas and Piwowarski, Benjamin and Staiano, Jacopo},
journal={arXiv preprint arXiv:2301.11312},
year={2023}
}
``` |
Braddy/xview_captions_v3 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: text
sequence: string
- name: file_id
dtype: string
splits:
- name: train
num_bytes: 94674025.0
num_examples: 949
download_size: 94634260
dataset_size: 94674025.0
---
# Dataset Card for "xview_captions_v3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
pharaouk/glaive-code-assistant-v2 | ---
license: apache-2.0
size_categories:
- 100K<n<1M
tags:
- code
- synthetic
---
# Glaive-code-assistant-v2
Glaive-code-assistant-v2 is a dataset of ~215k code problems and solutions generated using Glaive’s synthetic data generation platform.
This is built on top of the previous version of the dataset that can be found [here](https://huggingface.co/datasets/glaiveai/glaive-code-assistant)
To report any problems or suggestions in the data, join the [Glaive discord](https://discord.gg/fjQ4uf3yWD) |
Tongjilibo/self_cognition | ---
license: apache-2.0
---
# Introduction
- A self-cognition dataset collected from the internet and from the responses of various model APIs, for use when training your own models
# Data Sources
## 1. self_cognition data sources
- [llama_factory](https://github.com/hiyouga/LLaMA-Factory/blob/main/data/identity.json)
- [jamesphe/self_cognition](https://huggingface.co/datasets/jamesphe/self_cognition), cleaned
- [wangrongsheng/self_cognition](https://huggingface.co/datasets/wangrongsheng/self_cognition/tree/main), cleaned
# Marker Token Reference
```text
<NAME>: the model's name
<COMPANY>: the model's company
<VERSION>: the model's version
<DATE>: the release date of the current version
<DESCRIPTION>: the model's description, main functions, and values or philosophy
<ABILITY>: the model's abilities and scope of use
<LIMITATION>: the model's limitations and the regulations, moral standards, or ethical guidelines it follows
<AUTHOR>: the model's author or development team
<ROLE>: the model's role definition
``` |
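The marker tokens above can be filled in with plain string substitution when preparing training samples; a minimal sketch (the profile values are hypothetical, not from this dataset):

```python
# Substitute marker tokens in a self-cognition sample with concrete values.
profile = {
    "<NAME>": "DemoBot",      # hypothetical values for illustration
    "<COMPANY>": "DemoLab",
    "<VERSION>": "1.0",
}
sample = "I am <NAME>, a model developed by <COMPANY>, version <VERSION>."
for marker, value in profile.items():
    sample = sample.replace(marker, value)
print(sample)  # I am DemoBot, a model developed by DemoLab, version 1.0.
```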
heliosprime/twitter_dataset_1712983312 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 4588
num_examples: 10
download_size: 7800
dataset_size: 4588
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1712983312"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
eb/num25000 | ---
dataset_info:
features:
- name: text
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 33500451.6
num_examples: 22500
- name: test
num_bytes: 3722272.4
num_examples: 2500
download_size: 21358658
dataset_size: 37222724.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
CBrann/my_dataset | ---
dataset_info:
features:
- name: text
dtype: string
- name: image
dtype: string
- name: conditioning_image
dtype: string
splits:
- name: train
num_bytes: 16159749
num_examples: 27511
download_size: 2622361
dataset_size: 16159749
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "my_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
fimu-docproc-research/CIVQA_EasyOCR_LayoutLM_Train | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: bbox
dtype:
array2_d:
shape:
- 512
- 4
dtype: int32
- name: attention_mask
sequence: int32
- name: image
dtype:
array3_d:
shape:
- 3
- 224
- 224
dtype: int32
- name: start_positions
dtype: int32
- name: end_positions
dtype: int32
- name: questions
dtype: string
- name: answers
dtype: string
splits:
- name: train
num_bytes: 89021492745
num_examples: 143765
download_size: 913954164
dataset_size: 89021492745
license: mit
language:
- cs
tags:
- finance
---
# CIVQA EasyOCR LayoutLM Train Dataset
The CIVQA (Czech Invoice Visual Question Answering) dataset was created with EasyOCR, and it is encoded for LayoutLM models. This dataset contains only the train split. The validation part of the dataset can be found on this URL: https://huggingface.co/datasets/fimu-docproc-research/CIVQA_EasyOCR_LayoutLM_Validation
The pre-encoded train dataset can be found at this link: https://huggingface.co/datasets/fimu-docproc-research/CIVQA_EasyOCR_Train
All invoices used in this dataset were obtained from public sources. Across these invoices, we focused on 15 different entities that are crucial for invoice processing.
- Invoice number
- Variable symbol
- Specific symbol
- Constant symbol
- Bank code
- Account number
- ICO
- Total amount
- Invoice date
- Due date
- Name of supplier
- IBAN
- DIC
- QR code
- Supplier's address
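The metadata above fixes the LayoutLM encoding to 512-token sequences, a 512x4 `bbox` array, and a 3x224x224 `image` array; a minimal sketch (with dummy data, not drawn from the dataset) of checking one example's shapes:

```python
import numpy as np

# Dummy encoded example matching the feature shapes declared in the card's
# metadata: 512-token sequences, bbox 512x4, image 3x224x224, all int32.
example = {
    "input_ids": np.zeros(512, dtype=np.int32),
    "attention_mask": np.ones(512, dtype=np.int32),
    "bbox": np.zeros((512, 4), dtype=np.int32),
    "image": np.zeros((3, 224, 224), dtype=np.int32),
}
assert example["bbox"].shape == (512, 4)
assert example["image"].shape == (3, 224, 224)
print("shapes OK")
```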
The invoices included in this dataset were gathered from the internet. We understand that privacy is of utmost importance, and we sincerely apologise for any inconvenience caused by including your identifiable information in this dataset. If you have identified your data in this dataset and wish to have it removed from research use, please fill in the following form: https://forms.gle/tUVJKoB22oeTncUD6
We profoundly appreciate your cooperation and understanding in this matter. |
rombodawg/LosslessMegaCodeTrainingV2 | ---
license: other
---
_________________________________________________________________________________
VERSION 3 IS RELEASED. DOWNLOAD HERE:
- https://huggingface.co/datasets/rombodawg/LosslessMegaCodeTrainingV3_2.2m_Evol
_________________________________________________________________________________
Updated/Uncensored version 1 here: https://huggingface.co/datasets/rombodawg/2XUNCENSORED_MegaCodeTraining188k
Non-code instruct training here: https://huggingface.co/datasets/rombodawg/2XUNCENSORED_alpaca_840k_Evol_USER_ASSIS
Legacy version 1 code training here: https://huggingface.co/datasets/rombodawg/MegaCodeTraining200k
This is the ultimate code training dataset, created to be lossless so that the AI model does not lose any of the abilities it had previously (such as logical skills) after training on it.
The dataset is this large so that, as the model learns to code, it continues to follow regular instructions and does not lose previously learned abilities.
This is the outcome of all my work gathering data, testing AI models, and discovering what, why, and how coding models do and don't perform well.
If none of this makes sense, think of it this way: I took the old MegaCoding dataset, added roughly 8x more purely instruction-based (non-coding) data, then ran a script to remove tens of thousands of instruction lines that were deemed censored. This dataset is the result of that process.
This dataset is the combination of my 2 previous datasets found below:
Coding:
https://huggingface.co/datasets/rombodawg/2XUNCENSORED_MegaCodeTraining188k
Instruction following:
https://huggingface.co/datasets/rombodawg/2XUNCENSORED_alpaca_840k_Evol_USER_ASSIST |
EarthnDusk/FloraFauna_Dataset | ---
license: creativeml-openrail-m
---
|
Maljean/dataset | ---
license: apache-2.0
---
|
haiyan1/qizhikejihaha | ---
license: apache-2.0
task_categories:
- image-classification
- text-classification
language:
- zh
tags:
- 那你
- medical
- chemistry
- biology
- finance
- music
- art
- legal
- code
- climate
- not-for-all-audiences
- xx
- ssss
- xxss
- sss
- swwww
- wwwww
- wwww
- 我1
- '11'
- '22'
- '333'
- '444'
- '555'
- '666'
- '777'
- '6777'
- '7777'
size_categories:
- n<1K
pretty_name: 很好
---
Great |
perrynelson/waxal-pilot-wolof | ---
dataset_info:
features:
- name: input_values
sequence: float32
- name: labels
sequence: int64
splits:
- name: test
num_bytes: 1427656040
num_examples: 1075
- name: train
num_bytes: 659019824
num_examples: 501
- name: validation
num_bytes: 1075819008
num_examples: 803
download_size: 3164333891
dataset_size: 3162494872
---
# Dataset Card for "waxal-pilot-wolof"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dialect-ai/shironaam | ---
license: cc-by-nc-sa-4.0
task_categories:
- text-generation
- summarization
- sentence-similarity
- text2text-generation
language:
- bn
tags:
- headline-generation
- low-resource
- information-extraction
- news-clustering
- keyword-identification
- document-categorization
size_categories:
- 100K<n<1M
---
# Dataset Card for Shironaam Corpus
## Dataset Description
- **Homepage:**
- **Repository:** https://github.com/dialect-ai/BenHeadGen
- **Paper:** https://aclanthology.org/2023.eacl-main.4/
- **Leaderboard:**
- **Point of Contact:** [Abu Ubaida Akash](mailto:akash.ubaida@gmail.com)
### Dataset Summary
Automatic headline generation systems have the potential to assist editors in finding interesting headlines to attract visitors or readers.
However, the performance of headline generation systems remains challenging due to the unavailability of sufficient parallel data for
low-resource languages like Bengali. We provide **Shironaam**, a large-scale news headline generation dataset of a low-resource language
_i.e._, Bengali containing over 240K news headline-article pairings with auxiliary information such as image captions, topic words,
and category information. Also, this dataset can potentially be used for other tasks such as document categorization, news clustering,
keyword identification, _etc._ [(read more)](https://aclanthology.org/2023.eacl-main.4.pdf).
<!---
### Supported Tasks and Leaderboards
[More Information Needed]
-->
### Language(s)
Bengali
## Dataset Structure
### Data Instances
One example from the test split of the dataset is given below in JSON format.
```
{
"news_link": https://www.ajkerpatrika.com/169885/%E0%A6%AA%E0%A6%B0%E0%A6%BF%E0%A6%AC%E0%A7%87%E0%A6%B6%E0%A6%A6%E0%A7%82%E0%A6%B7%E0%A6%A3%E0%A7%87-%E0%A6%AC%E0%A7%8D%E0%A6%AF%E0%A6%BE%E0%A6%A7%E0%A6%BF-%E0%A6%AC%E0%A6%BE%E0%A7%9C%E0%A6%9B%E0%A7%87-%E0%A6%B8%E0%A7%8D%E0%A6%AC%E0%A6%BE%E0%A6%B8%E0%A7%8D%E0%A6%A5%E0%A7%8D%E0%A6%AF%E0%A6%AE%E0%A6%A8%E0%A7%8D%E0%A6%A4%E0%A7%8D%E0%A6%B0%E0%A7%80,
"head_lines": পরিবেশদূষণে ব্যাধি বাড়ছে: স্বাস্থ্যমন্ত্রী,
"article": স্বাস্থ্য ও পরিবারকল্যাণমন্ত্রী জাহিদ মালেক বলেছেন, প্রতিনিয়ত বিশ্বে পরিবেশ দূষিত হচ্ছে। এতে নতুন নতুন রোগের সৃষ্টি হচ্ছে। পরিবেশদূষণের কারণে ১৫-২০ শতাংশ মানসিক রোগী বাড়ছে। বিশ্ব স্বাস্থ্য দিবস উপলক্ষে আজ বৃহস্পতিবার রাজধানীর ওসমানী স্মৃতি মিলনায়তনে আয়োজিত এক অনুষ্ঠানে তিনি এসব কথা বলেন।স্বাস্থ্যমন্ত্রী বলেন, 'বর্তমানে পরিবেশ, পানি দূষিত হচ্ছে। দেশের পরিবেশ ভালো থাকলে কৃষি, পানি, স্বাস্থ্য ভালো থাকবে এবং চাপ কম থাকবে। এগুলো ভালো রাখতে হবে, তবেই আমরা ভালো থাকব।'জাহিদ মালেক বলেন, কলকারখানার গ্যাস ও যানবাহনের দূষিত ধোঁয়া পরিবেশ নষ্ট করছে। এতে ডায়রিয়া, কলেরা, চিকুনগুনিয়াসহ নানা নতুন-পুরোনো রোগ দেখা দিচ্ছে। দেশের অন্যান্য স্থানের চেয়ে ঢাকায় বায়ুদূষণ বেশি হচ্ছে। দেশে যে পরিমাণ বনাঞ্চল থাকার কথা, তা নেই।পরিবেশ ধ্বংসে বাংলাদেশের হাত না থাকলেও সবচেয়ে বেশি ক্ষতির মুখে পড়তে হয় মন্তব্য করে স্বাস্থ্যমন্ত্রী বলেন, বিশ্বে প্রতিবছর ৬০ হাজার হেক্টর বন ধ্বংস হচ্ছে। পরিবেশ ধ্বংসে যুক্তরাষ্ট্র, ব্রাজিল ও ইউরোপের দেশগুলোর বড় ভূমিকা থাকলেও বাংলাদেশের মতো দেশগুলোকে প্রভাব মোকাবিলা করতে হয়।পানি সমস্যার কারণে ডায়রিয়া বাড়ছে জানিয়ে জাহিদ মালেক বলেন, পানি সমস্যার সমাধান করতে হবে। এর কারণে ডায়রিয়া, কলেরাসহ অন্যান্য রোগ বেড়েই চলেছে। ভেজাল খাদ্যের কারণে সংক্রামক ও অসংক্রামক রোগ বাড়ছে। তবে আমাদের স্বাস্থ্য ব্যবস্থাপনাও ভালো রাখতে হবে। দেশকে ভালো রাখতে হলে দেশের সম্পদ ঠিক রাখতে হবে।দেশের অন্যান্য উন্নয়নের পাশাপাশি স্বাস্থ্যব্যবস্থারও অনেক উন্নতি হয়েছে জানিয়ে স্বাস্থ্যমন্ত্রী বলেন, 'আমাদের গড় আয়ু এখন ৭৩ বছর। ভ্যাকসিনেও আমরা অনেক ভালো করেছি, বিশ্বে অষ্টম হয়েছি। লক্ষ্যমাত্রার ৯৫ ভাগ মানুষকে টিকা দিয়েছি। ভালো কাজ করেছি বিধায় জিডিপি এখনো সাতে রয়েছে। পাশের শ্রীলঙ্কা এখন দেউলিয়া, তারা হয়তো ভালো ব্যবস্থা নিতে পারেনি। কিন্তু আমাদের খাদ্যে কোনো ঘাটতি নেই। ৪৫ বিলিয়ন ডলার আমাদের রিজার্ভ রয়েছে। মাথাপিছু ঋণ অনেক দেশের তুলনায় কম রয়েছে।',
"tags": স্বাস্থ্যমন্ত্রী,রাজধানী,পরিবেশ দূষণ,জাহিদ মালেক,
"image_caption": অনুষ্ঠানে বক্তব্য দেন স্বাস্থ্য ও পরিবার কল্যাণমন্ত্রী জাহিদ মালেক।,
"category": national
}
```
### Data Fields
- `news_link`: A string representing the link of the news source
- `head_lines`: A string representing the headline of the corresponding news article
- `article`: A string representing the article body of the news
- `tags`: A string representing the tags/topic-words related to the corresponding news article
- `image_caption`: A string representing the caption(s) of the images from the corresponding news article
- `category`: A string representing the category the corresponding news belongs to
### Data Splits
The **Shironaam** dataset is distributed over 13 different domains. After preprocessing the raw corpus, we have 240,580 news samples, each a tuple of (headline, article, image caption, topic words, category). To ensure a balanced distribution, we maintain ratios of 92% (220,574), 2% (4,994), and 6% (15,012) samples across all the categories to construct the train, validation, and
test sets, respectively.
| **Category** | **Train** | **Valid** | **Test** | **Total** |
|:-------------:|:-----------:|:---------:|:----------:|:-----------:|
| Entertainment | 16,104 | 365 | 1095 | 17,565 |
| National | 117,566 | 2,664 | 7,994 | 128,226 |
| Nature | 467 | 10 | 31 | 510 |
| International | 30,558 | 692 | 2,078 | 33,329 |
| Sports | 17,635 | 399 | 1,199 | 19,235 |
| Economy | 6,447 | 146 | 438 | 7,032 |
| Life-Health | 6,356 | 144 | 432 | 6,933 |
| Miscellaneous | 1,599 | 36 | 108 | 1,744 |
| Opinion | 3,501 | 79 | 238 | 3,819 |
| Politics | 15,018 | 340 | 1,021 | 16,380 |
| Edu-Career | 4,008 | 90 | 272 | 4,372 |
| Science-Tech | 1,046 | 23 | 71 | 1,141 |
| Religion | 269 | 6 | 18 | 294 |
| **Total** | **220,574** | **4,994** | **15,012** | **240,580** |
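The stated 92/2/6 split can be verified from the table totals; the numbers below are copied directly from the table above:

```python
# Split totals from the table above.
train, valid, test = 220_574, 4_994, 15_012
total = train + valid + test
assert total == 240_580  # matches the stated corpus size

# Percentage share of each split, rounded to one decimal place.
ratios = [round(100 * n / total, 1) for n in (train, valid, test)]
print(ratios)  # → [91.7, 2.1, 6.2]
```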
## Dataset Creation
We crawl around 900,000 raw data samples from seven famous Bengali newspapers concentrating on certain
criteria, such as headline, article, image caption, category, and topic words. Since each of the newspapers
mentioned above has its own professional authors and distinct writing style, we consider multiple sources
to prevent the bias of a particular annotation style. To ensure content diversity, we also cover various
domains from all the news dailies. The majority of the news samples are extracted from HTML bodies of the
corresponding publications, while some are rendered using JavaScript. However, two of them do not provide
the archives on their websites; therefore, we collect the samples through their APIs... [details in the paper](https://aclanthology.org/2023.eacl-main.4.pdf)
<!---
### Curation Rationale
[More Information Needed]
-->
### Source Data
| **Newspaper** | **URL** |
|:-------------------:|:------------------------:|
| Prothom Alo | www.prothomalo.com |
| Naya Diganta | www.dailynayadiganta.com |
| Ajker Patrika | www.ajkerpatrika.com |
| Bangladesh Protidin | www.bd-pratidin.com |
| Samakal | www.samakal.com |
| Bhorer Kagoj | www.bhorerkagoj.com |
| Dhaka Tribune | www.dhakatribune.com |
<!---
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
-->
### Discussion of Ethics
We considered some ethical aspects while scraping the data. We requested data at a reasonable rate
without any intention of a DDoS attack. Moreover, for each website, we read the instructions listed in
robots.txt to check whether we can crawl the intended content. We tried to minimize offensive texts in
the data by explicitly crawling the sites where such contents are minimal. Further, we removed the
Personal Identifying Information (PII) such as name, phone number, email address, _etc._ from the corpus.
### Other Known Limitations
Our dataset relies on auxiliary information such as image captions and topic words to achieve superior
performance in generating news headlines. However, it is quite common to include images and extra information
(e.g., topic words) to increase an article's visibility, support, and context. On top of that, the **Shironaam**
corpus supports only Bengali, a widely spoken but low-resource language. Still, this idea of using auxiliary
information to improve headline generation performance can easily be extended to many languages.
## Additional Information
<!---
### Dataset Curators
[More Information Needed]
-->
### Licensing Information
Contents of this repository are restricted to only non-commercial research purposes under the [Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License (CC BY-NC-SA 4.0)](https://creativecommons.org/licenses/by-nc-sa/4.0/).
Copyright of the dataset contents belongs to the original copyright holders.
### Citation Information
If you find this work useful for your research, please consider citing:
```
@inproceedings{akash-etal-2023-shironaam,
title = "Shironaam: {B}engali News Headline Generation using Auxiliary Information",
author = "Akash, Abu Ubaida and
Nayeem, Mir Tafseer and
Shohan, Faisal Tareque and
Islam, Tanvir",
booktitle = "Proceedings of the 17th Conference of the European Chapter of the Association for Computational Linguistics",
month = may,
year = "2023",
address = "Dubrovnik, Croatia",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2023.eacl-main.4",
pages = "52--67"
}
```
### Contributors
- Abu Ubaida Akash (akash.ubaida@gmail.com)
- Mir Tafseer Nayeem (mnayeem@ualberta.ca)
- Faisal Tareque Shohan (faisaltareque@hotmail.com)
- Tanvir Islam (tislam@hawaii.edu)
### Acknowledgements
- This work is the outcome of the ongoing research at [Dialect AI Research Group](https://github.com/dialect-ai).
- Mir Tafseer Nayeem is supported by [Huawei](https://digitalpower.huawei.com/en/) Doctoral Fellowship. |
AigizK/bashkir-russian-parallel-corpora | ---
language:
- ba
- ru
license: cc-by-4.0
task_categories:
- translation
dataset_info:
features:
- name: ba
dtype: string
- name: ru
dtype: string
- name: corpus
dtype: string
splits:
- name: train
num_bytes: 409240581
num_examples: 1093189
download_size: 195923641
dataset_size: 409240581
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "bashkir-russian-parallel-corpora"
### How the dataset was assembled.
1. Find a text in two languages: it can be a translated book or a web page (Wikipedia, a news site).
2. Our algorithm tries to match Bashkir sentences with their Russian translations.
3. We give these pairs to people for verification.
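The matching in step 2 is not described in detail in this card. A common baseline filter for this kind of sentence alignment is a character-length-ratio check (in the spirit of Gale–Church alignment), sketched below as an illustration; the actual algorithm used by the authors is not published here:

```python
def plausible_pair(src: str, tgt: str, max_ratio: float = 1.6) -> bool:
    """Length-ratio filter: translated sentence pairs tend to have
    similar lengths, so wildly different lengths suggest a bad match."""
    a, b = len(src), len(tgt)
    if a == 0 or b == 0:
        return False
    return max(a, b) / min(a, b) <= max_ratio
```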
```
@inproceedings{shakirov2023bashkir,
  title  = {Bashkir-Russian parallel corpora},
  author = {Shakirov, Iskander and Kunafin, Aigiz},
  year   = {2023}
}
``` |
kimnt93/vi-new-instruction | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: instruction_type
dtype: string
splits:
- name: train
num_bytes: 90095
num_examples: 633
download_size: 48833
dataset_size: 90095
---
vi: https://github.com/XueFuzhao/InstructionWild/ + https://github.com/yizhongw/self-instruct |
dhiruHF/occupation-classifier-2 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 459139
num_examples: 3552
download_size: 108586
dataset_size: 459139
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "occupation-classifier-2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mayflowergmbh/tinyMMLU_de | ---
language:
- de
---
# Dataset Card for https://huggingface.co/datasets/mayflowergmbh/tinyMMLU_de
<!-- Provide a quick summary of the dataset. -->
## Dataset Details
German tinyMMLU translation.
### Dataset Description
This dataset is an AzureML translation of [tinyBenchmarks/tinyMMLU](https://huggingface.co/datasets/tinyBenchmarks/tinyMMLU)
### Dataset Sources
<!-- Provide the basic links for the dataset. -->
- **Repository:** [tinyBenchmarks/tinyMMLU](https://huggingface.co/datasets/tinyBenchmarks/tinyMMLU)
- **Paper:** [tinyBenchmarks: evaluating LLMs with fewer examples](https://huggingface.co/papers/2402.14992)
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
Please see the documentation of the [original dataset](https://huggingface.co/datasets/tinyBenchmarks/tinyMMLU) |
BRlkl/PAD | ---
license: openrail
---
|
open-llm-leaderboard/details_huggyllama__llama-13b | ---
pretty_name: Evaluation run of huggyllama/llama-13b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [huggyllama/llama-13b](https://huggingface.co/huggyllama/llama-13b) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 122 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 4 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run.The \"train\" split is always pointing to the latest\
\ results.\n\nAn additional configuration \"results\" store all the aggregated results\
\ of the run (and is used to compute and display the aggregated metrics on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_huggyllama__llama-13b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-23T10:41:44.150256](https://huggingface.co/datasets/open-llm-leaderboard/details_huggyllama__llama-13b/blob/main/results_2023-09-23T10-41-44.150256.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0019924496644295304,\n\
\ \"em_stderr\": 0.000456667646266702,\n \"f1\": 0.056602348993288636,\n\
\ \"f1_stderr\": 0.0013004668300984712,\n \"acc\": 0.4191229752993855,\n\
\ \"acc_stderr\": 0.009626252314482865\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0019924496644295304,\n \"em_stderr\": 0.000456667646266702,\n\
\ \"f1\": 0.056602348993288636,\n \"f1_stderr\": 0.0013004668300984712\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0758150113722517,\n \
\ \"acc_stderr\": 0.007291205723162579\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7624309392265194,\n \"acc_stderr\": 0.011961298905803152\n\
\ }\n}\n```"
repo_url: https://huggingface.co/huggyllama/llama-13b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|arc:challenge|25_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|arc:challenge|25_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_23T10_41_44.150256
path:
- '**/details_harness|drop|3_2023-09-23T10-41-44.150256.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-23T10-41-44.150256.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_23T10_41_44.150256
path:
- '**/details_harness|gsm8k|5_2023-09-23T10-41-44.150256.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-23T10-41-44.150256.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hellaswag|10_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hellaswag|10_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T15:13:44.970123.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-19T22:15:08.436043.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-24T15:13:44.970123.parquet'
- split: 2023_08_19T22_15_08.436043
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-19T22:15:08.436043.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-19T22:15:08.436043.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_23T10_41_44.150256
path:
- '**/details_harness|winogrande|5_2023-09-23T10-41-44.150256.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-23T10-41-44.150256.parquet'
- config_name: original_mmlu_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:abstract_algebra|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:anatomy|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:astronomy|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:business_ethics|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:clinical_knowledge|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:college_biology|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:college_chemistry|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:college_computer_science|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:college_mathematics|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:college_medicine|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:college_physics|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:computer_security|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:conceptual_physics|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:econometrics|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:electrical_engineering|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:elementary_mathematics|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:formal_logic|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:global_facts|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:high_school_biology|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:high_school_chemistry|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:high_school_computer_science|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:high_school_european_history|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:high_school_geography|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:high_school_mathematics|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:high_school_physics|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:high_school_psychology|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:high_school_statistics|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:high_school_us_history|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:high_school_world_history|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:human_aging|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:human_sexuality|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:international_law|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:jurisprudence|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:logical_fallacies|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:machine_learning|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:management|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:marketing|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:medical_genetics|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:miscellaneous|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:moral_disputes|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:moral_scenarios|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:nutrition|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:philosophy|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:prehistory|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:professional_accounting|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:professional_law|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:professional_medicine|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:professional_psychology|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:public_relations|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:security_studies|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:sociology|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:us_foreign_policy|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:virology|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:world_religions|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:abstract_algebra|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:anatomy|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:astronomy|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:business_ethics|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:clinical_knowledge|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:college_biology|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:college_chemistry|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:college_computer_science|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:college_mathematics|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:college_medicine|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:college_physics|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:computer_security|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:conceptual_physics|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:econometrics|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:electrical_engineering|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:elementary_mathematics|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:formal_logic|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:global_facts|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:high_school_biology|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:high_school_chemistry|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:high_school_computer_science|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:high_school_european_history|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:high_school_geography|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:high_school_mathematics|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:high_school_physics|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:high_school_psychology|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:high_school_statistics|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:high_school_us_history|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:high_school_world_history|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:human_aging|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:human_sexuality|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:international_law|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:jurisprudence|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:logical_fallacies|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:machine_learning|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:management|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:marketing|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:medical_genetics|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:miscellaneous|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:moral_disputes|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:moral_scenarios|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:nutrition|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:philosophy|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:prehistory|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:professional_accounting|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:professional_law|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:professional_medicine|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:professional_psychology|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:public_relations|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:security_studies|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:sociology|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:us_foreign_policy|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:virology|5_2023-08-28T19:54:33.085163.parquet'
- '**/details_original|mmlu:world_religions|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_abstract_algebra_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:abstract_algebra|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:abstract_algebra|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_anatomy_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:anatomy|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:anatomy|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_astronomy_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:astronomy|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:astronomy|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_business_ethics_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:business_ethics|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:business_ethics|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_clinical_knowledge_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:clinical_knowledge|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:clinical_knowledge|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_college_biology_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:college_biology|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_biology|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_college_chemistry_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:college_chemistry|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_chemistry|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_college_computer_science_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:college_computer_science|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_computer_science|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_college_mathematics_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:college_mathematics|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_mathematics|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_college_medicine_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:college_medicine|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_medicine|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_college_physics_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:college_physics|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_physics|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_computer_security_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:computer_security|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:computer_security|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_conceptual_physics_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:conceptual_physics|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:conceptual_physics|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_econometrics_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:econometrics|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:econometrics|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_electrical_engineering_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:electrical_engineering|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:electrical_engineering|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_elementary_mathematics_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:elementary_mathematics|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:elementary_mathematics|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_formal_logic_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:formal_logic|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:formal_logic|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_global_facts_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:global_facts|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:global_facts|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_high_school_biology_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:high_school_biology|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_biology|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_high_school_chemistry_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:high_school_chemistry|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_chemistry|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_high_school_computer_science_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:high_school_computer_science|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_computer_science|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_high_school_european_history_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:high_school_european_history|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_european_history|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_high_school_geography_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:high_school_geography|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_geography|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_high_school_government_and_politics_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_high_school_macroeconomics_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_high_school_mathematics_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:high_school_mathematics|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_mathematics|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_high_school_microeconomics_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_high_school_physics_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:high_school_physics|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_physics|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_high_school_psychology_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:high_school_psychology|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_psychology|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_high_school_statistics_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:high_school_statistics|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_statistics|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_high_school_us_history_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:high_school_us_history|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_us_history|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_high_school_world_history_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:high_school_world_history|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_world_history|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_human_aging_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:human_aging|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:human_aging|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_human_sexuality_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:human_sexuality|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:human_sexuality|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_international_law_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:international_law|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:international_law|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_jurisprudence_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:jurisprudence|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:jurisprudence|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_logical_fallacies_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:logical_fallacies|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:logical_fallacies|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_machine_learning_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:machine_learning|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:machine_learning|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_management_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:management|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:management|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_marketing_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:marketing|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:marketing|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_medical_genetics_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:medical_genetics|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:medical_genetics|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_miscellaneous_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:miscellaneous|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:miscellaneous|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_moral_disputes_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:moral_disputes|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:moral_disputes|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_moral_scenarios_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:moral_scenarios|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:moral_scenarios|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_nutrition_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:nutrition|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:nutrition|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_philosophy_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:philosophy|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:philosophy|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_prehistory_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:prehistory|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:prehistory|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_professional_accounting_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:professional_accounting|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:professional_accounting|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_professional_law_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:professional_law|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:professional_law|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_professional_medicine_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:professional_medicine|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:professional_medicine|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_professional_psychology_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:professional_psychology|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:professional_psychology|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_public_relations_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:public_relations|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:public_relations|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_security_studies_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:security_studies|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:security_studies|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_sociology_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:sociology|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:sociology|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_us_foreign_policy_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:us_foreign_policy|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:us_foreign_policy|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_virology_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:virology|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:virology|5_2023-08-28T19:54:33.085163.parquet'
- config_name: original_mmlu_world_religions_5
data_files:
- split: 2023_08_28T19_54_33.085163
path:
- '**/details_original|mmlu:world_religions|5_2023-08-28T19:54:33.085163.parquet'
- split: latest
path:
- '**/details_original|mmlu:world_religions|5_2023-08-28T19:54:33.085163.parquet'
- config_name: results
data_files:
- split: 2023_07_24T15_13_44.970123
path:
- results_2023-07-24T15:13:44.970123.parquet
- split: 2023_08_19T22_15_08.436043
path:
- results_2023-08-19T22:15:08.436043.parquet
- split: 2023_08_28T19_54_33.085163
path:
- results_2023-08-28T19:54:33.085163.parquet
- split: 2023_09_23T10_41_44.150256
path:
- results_2023-09-23T10-41-44.150256.parquet
- split: latest
path:
- results_2023-09-23T10-41-44.150256.parquet
---
# Dataset Card for Evaluation run of huggyllama/llama-13b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/huggyllama/llama-13b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [huggyllama/llama-13b](https://huggingface.co/huggyllama/llama-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 122 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_huggyllama__llama-13b",
"harness_winogrande_5",
split="train")
```
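The run-timestamp split names in the configurations above are derived from the run timestamp by substituting characters that are not valid in split names. A minimal sketch of that mapping (`timestamp_to_split` is a hypothetical helper for illustration, not part of the `datasets` API):

```python
def timestamp_to_split(timestamp: str) -> str:
    # Replace '-' and ':' (not allowed in split names) with '_';
    # the fractional-second '.' separator is kept as-is.
    return timestamp.replace("-", "_").replace(":", "_")

# The run above becomes the split name used in the YAML configs:
print(timestamp_to_split("2023-09-23T10:41:44.150256"))
# → 2023_09_23T10_41_44.150256
```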
## Latest results
These are the [latest results from run 2023-09-23T10:41:44.150256](https://huggingface.co/datasets/open-llm-leaderboard/details_huggyllama__llama-13b/blob/main/results_2023-09-23T10-41-44.150256.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"em": 0.0019924496644295304,
"em_stderr": 0.000456667646266702,
"f1": 0.056602348993288636,
"f1_stderr": 0.0013004668300984712,
"acc": 0.4191229752993855,
"acc_stderr": 0.009626252314482865
},
"harness|drop|3": {
"em": 0.0019924496644295304,
"em_stderr": 0.000456667646266702,
"f1": 0.056602348993288636,
"f1_stderr": 0.0013004668300984712
},
"harness|gsm8k|5": {
"acc": 0.0758150113722517,
"acc_stderr": 0.007291205723162579
},
"harness|winogrande|5": {
"acc": 0.7624309392265194,
"acc_stderr": 0.011961298905803152
}
}
```
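Once loaded, the aggregated results are plain nested dictionaries, so individual metrics can be pulled out directly. A small sketch (the dictionary below is abbreviated from the JSON above):

```python
# Abbreviated copy of the aggregated results shown above.
results = {
    "all": {"acc": 0.4191229752993855, "acc_stderr": 0.009626252314482865},
    "harness|gsm8k|5": {"acc": 0.0758150113722517},
    "harness|winogrande|5": {"acc": 0.7624309392265194},
}

# Per-task accuracies, keyed by the harness task name (skipping the "all" aggregate).
task_acc = {task: metrics["acc"] for task, metrics in results.items() if task != "all"}
print(task_acc["harness|winogrande|5"])
```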
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Nexdata/302_Person_Hindi_and_English_Bilingual_Spontaneous_Monologue_smartphone_speech_dataset | ---
license: cc-by-nc-nd-4.0
---
## Description
Hindi and English bilingual spontaneous monologue smartphone speech dataset, collected from dialogues based on given topics and covering a generic domain. Our dataset was collected from an extensive and diverse pool of speakers (302 people in total, ages 18 to 46) across different geographic regions, enhancing model performance in real and complex tasks. Quality tested by various AI companies. We strictly adhere to data protection regulations and privacy standards, ensuring the maintenance of user privacy and legal rights throughout the data collection, storage, and usage processes; our datasets are all GDPR, CCPA, and PIPL compliant.
For more details, please refer to the link: https://www.nexdata.ai/dataset/1420?source=Huggingface
## Format
16 kHz, 16-bit, wav, mono channel
## Content category
Individuals speaking naturally, with no specific content limitations. Each speaker records 20 audios in each language (40 recordings per person), each recording lasting about 10-20 seconds.
## Recording condition
Quiet indoor environment, without echoes, background voices, or obvious noises
## Recording device
Android phone, iPhone
## Speaker
Total 302 contributors, 45% male and 55% female. 291 contributors aged 18-37, 10 contributors aged 38-45, and 1 contributor aged 46-65
## Country
India(IND)
## Language
Hindi, English
# Licensing Information
Commercial License
|
FanChen0116/bus_few4_128x_empty | ---
dataset_info:
features:
- name: id
dtype: int64
- name: tokens
sequence: string
- name: labels
sequence:
class_label:
names:
'0': O
'1': I-from_location
'2': B-from_location
'3': B-leaving_date
'4': I-leaving_date
'5': I-to_location
'6': B-to_location
- name: request_slot
sequence: string
splits:
- name: train
num_bytes: 1560019
num_examples: 8960
- name: validation
num_bytes: 6128
num_examples: 35
- name: test
num_bytes: 70618
num_examples: 377
download_size: 0
dataset_size: 1636765
---
# Dataset Card for "bus_few4_128x_empty"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
worldboss/qa_nia_faq_chat | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 36450
num_examples: 66
download_size: 20652
dataset_size: 36450
---
# Dataset Card for "qa_nia_faq_chat"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
carlesoctav/en-id-parallel-sentences | ---
dataset_info:
features:
- name: text_en
dtype: string
- name: text_id
dtype: string
splits:
- name: msmarcoquery
num_bytes: 41010003
num_examples: 500000
- name: combinedtech
num_bytes: 44901963
num_examples: 276659
- name: msmarcocollection
num_bytes: 351086941
num_examples: 500000
- name: TED2020
num_bytes: 32590228
num_examples: 163319
- name: Tatoeba
num_bytes: 797670
num_examples: 10543
- name: NeuLabTedTalks
num_bytes: 19440416
num_examples: 94224
- name: QED
num_bytes: 40115874
num_examples: 274581
- name: tico19
num_bytes: 959990
num_examples: 3071
download_size: 282831590
dataset_size: 530903085
---
# Dataset Card for "en-id-parallel-sentences"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
DynamicSuperbPrivate/SpeechDetection_Voxceleb1Train | ---
dataset_info:
features:
- name: file
dtype: string
- name: audio
dtype: audio
- name: text
dtype: string
- name: instruction
dtype: string
- name: label
dtype: string
- name: transcription
dtype: string
splits:
- name: train
num_bytes: 3188741275.0
num_examples: 12000
- name: validation
num_bytes: 733987727.88
num_examples: 2609
download_size: 3909471035
dataset_size: 3922729002.88
---
# Dataset Card for "SpeechDetection_VoxCeleb1Train"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
johannes-garstenauer/embeddings_from_distilbert_class_heaps_and_eval_part0_test | ---
dataset_info:
features:
- name: struct
dtype: string
- name: label
dtype: int64
- name: pred
dtype: int64
- name: cls_layer_6
sequence: float32
- name: cls_layer_5
sequence: float32
- name: cls_layer_4
sequence: float32
splits:
- name: train
num_bytes: 13428556
num_examples: 1408
download_size: 16665816
dataset_size: 13428556
---
# Dataset Card for "embeddings_from_distilbert_class_heaps_and_eval_part0_test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
zolak/twitter_dataset_78_1713094178 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 3263976
num_examples: 8268
download_size: 1667104
dataset_size: 3263976
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Th34/hgfjksloi | ---
license: openrail
---
|
per_sent | ---
annotations_creators:
- crowdsourced
language_creators:
- found
language:
- en
license:
- unknown
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- extended|other-MPQA-KBP Challenge-MediaRank
task_categories:
- text-classification
task_ids:
- sentiment-classification
paperswithcode_id: persent
pretty_name: PerSenT
dataset_info:
features:
- name: DOCUMENT_INDEX
dtype: int64
- name: TITLE
dtype: string
- name: TARGET_ENTITY
dtype: string
- name: DOCUMENT
dtype: string
- name: MASKED_DOCUMENT
dtype: string
- name: TRUE_SENTIMENT
dtype:
class_label:
names:
'0': Negative
'1': Neutral
'2': Positive
- name: Paragraph0
dtype:
class_label:
names:
'0': Negative
'1': Neutral
'2': Positive
- name: Paragraph1
dtype:
class_label:
names:
'0': Negative
'1': Neutral
'2': Positive
- name: Paragraph2
dtype:
class_label:
names:
'0': Negative
'1': Neutral
'2': Positive
- name: Paragraph3
dtype:
class_label:
names:
'0': Negative
'1': Neutral
'2': Positive
- name: Paragraph4
dtype:
class_label:
names:
'0': Negative
'1': Neutral
'2': Positive
- name: Paragraph5
dtype:
class_label:
names:
'0': Negative
'1': Neutral
'2': Positive
- name: Paragraph6
dtype:
class_label:
names:
'0': Negative
'1': Neutral
'2': Positive
- name: Paragraph7
dtype:
class_label:
names:
'0': Negative
'1': Neutral
'2': Positive
- name: Paragraph8
dtype:
class_label:
names:
'0': Negative
'1': Neutral
'2': Positive
- name: Paragraph9
dtype:
class_label:
names:
'0': Negative
'1': Neutral
'2': Positive
- name: Paragraph10
dtype:
class_label:
names:
'0': Negative
'1': Neutral
'2': Positive
- name: Paragraph11
dtype:
class_label:
names:
'0': Negative
'1': Neutral
'2': Positive
- name: Paragraph12
dtype:
class_label:
names:
'0': Negative
'1': Neutral
'2': Positive
- name: Paragraph13
dtype:
class_label:
names:
'0': Negative
'1': Neutral
'2': Positive
- name: Paragraph14
dtype:
class_label:
names:
'0': Negative
'1': Neutral
'2': Positive
- name: Paragraph15
dtype:
class_label:
names:
'0': Negative
'1': Neutral
'2': Positive
splits:
- name: train
num_bytes: 14595163
num_examples: 3355
- name: test_random
num_bytes: 2629500
num_examples: 579
- name: test_fixed
num_bytes: 3881800
num_examples: 827
- name: validation
num_bytes: 2322922
num_examples: 578
download_size: 23117196
dataset_size: 23429385
---
# Dataset Card for PerSenT
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [PerSenT](https://stonybrooknlp.github.io/PerSenT/)
- **Repository:** [https://github.com/MHDBST/PerSenT](https://github.com/MHDBST/PerSenT)
- **Paper:** [arXiv](https://arxiv.org/abs/2011.06128)
- **Leaderboard:** NA
- **Point of Contact:** [Mohaddeseh Bastan](mbastan@cs.stonybrook.edu)
### Dataset Summary
PerSenT is a crowd-sourced dataset that captures the sentiment of an author towards the main entity in a news article. This dataset contains annotations for 5.3k documents and 38k paragraphs covering 3.2k unique entities. For each article, annotators judge what the author’s sentiment is towards the main
(target) entity of the article. The annotations also include similar judgments on paragraphs within the article.
### Supported Tasks and Leaderboards
Sentiment Classification: Each document consists of multiple paragraphs. Each paragraph is labeled separately (Positive, Neutral, Negative) and the author’s sentiment towards the whole document is included as a document-level label.
### Languages
English
## Dataset Structure
### Data Instances
```json
{'DOCUMENT': "Germany's Landesbank Baden Wuertemberg won EU approval Tuesday for a state bailout after it promised to shrink its balance sheet by 40 percent and refocus on lending to companies.\n The bank was several state-owned German institutions to run into trouble last year after it ran up more huge losses from investing in high-risk proprietary trading and capital market activities -- a business the EU has now told it to shun.\n Seven current and former managers of the bank are also being investigated by German authorities for risking or damaging the bank's capital by carrying out or failing to block investments in high-risk deals worth hundreds of millions from 2006.\n The European Commission said its Tuesday approval for the state rescue of the bank and its new restructuring plan would allow it become a viable business again -- and that the cutbacks would help limit the unfair advantage over rivals that the bank would get from the state aid.\n Stuttgart-based LBBW earlier this year received a capital injection of (EURO)5 billion from the bank's shareholders all of them public authorities or state-owned including the state of Baden-Wuerttemberg the region's savings bank association and the city of Stuttgart.",
'DOCUMENT_INDEX': 1,
'MASKED_DOCUMENT': "[TGT] won EU approval Tuesday for a state bailout after it promised to shrink its balance sheet by 40 percent and refocus on lending to companies.\n [TGT] was several state-owned German institutions to run into trouble last year after [TGT] ran up more huge losses from investing in high-risk proprietary trading and capital market activities -- a business the EU has now told it to shun.\n Seven current and former managers of [TGT] are also being investigated by German authorities for risking or damaging [TGT]'s capital by carrying out or failing to block investments in high-risk deals worth hundreds of millions from 2006.\n The European Commission said its Tuesday approval for the state rescue of [TGT] and its new restructuring plan would allow it become a viable business again -- and that the cutbacks would help limit the unfair advantage over rivals that [TGT] would get from the state aid.\n Stuttgart-based LBBW earlier this year received a capital injection of (EURO)5 billion from [TGT]'s shareholders all of them public authorities or state-owned including the state of Baden-Wuerttemberg the region's savings bank association and the city of Stuttgart.",
'Paragraph0': 2,
'Paragraph1': 0,
'Paragraph10': -1,
'Paragraph11': -1,
'Paragraph12': -1,
'Paragraph13': -1,
'Paragraph14': -1,
'Paragraph15': -1,
'Paragraph2': 0,
'Paragraph3': 1,
'Paragraph4': 1,
'Paragraph5': -1,
'Paragraph6': -1,
'Paragraph7': -1,
'Paragraph8': -1,
'Paragraph9': -1,
'TARGET_ENTITY': 'Landesbank Baden Wuertemberg',
'TITLE': 'German bank LBBW wins EU bailout approval',
'TRUE_SENTIMENT': 0}
```
### Data Fields
- DOCUMENT_INDEX: ID of the document per original dataset
- TITLE: Title of the article
- DOCUMENT: Text of the article
- MASKED_DOCUMENT: Text of the article with the target entity masked with `[TGT]` token
- TARGET_ENTITY: The entity that the author is expressing opinion about
- TRUE_SENTIMENT: Label for entire article
- Paragraph{0..15}: Label for each paragraph in the article
**Note**: Labels are one of `[Negative, Neutral, Positive]`. Missing labels were replaced with `-1`.
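The `-1` placeholders can be dropped when post-processing the paragraph columns. A minimal sketch (`paragraph_labels` is a hypothetical helper, not part of the dataset; the toy instance mirrors the schema above):

```python
# Collect the paragraph-level labels from a PerSenT-style instance, dropping
# the -1 placeholders used for paragraphs that do not exist in the article.
# Label ids follow the card's class_label mapping: 0=Negative, 1=Neutral, 2=Positive.
LABEL_NAMES = ["Negative", "Neutral", "Positive"]

def paragraph_labels(example):
    labels = []
    for i in range(16):  # Paragraph0 .. Paragraph15
        value = example[f"Paragraph{i}"]
        if value == -1:  # missing paragraph
            continue
        labels.append(LABEL_NAMES[value])
    return labels

# Toy instance mirroring the example in "Data Instances" above.
example = {f"Paragraph{i}": -1 for i in range(16)}
example.update({"Paragraph0": 2, "Paragraph1": 0, "Paragraph2": 1})
print(paragraph_labels(example))  # ['Positive', 'Negative', 'Neutral']
```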
### Data Splits
To split the dataset, entities were divided into 4 mutually exclusive sets. Due to the nature of news collections, some entities tend to dominate the collection: four entities were the main entity in nearly 800 articles. To keep these entities from dominating the train or test splits, their articles were moved to a separate test collection. The remaining articles were split into training, dev, and test sets at random. The collection thus includes one standard test set of articles drawn at random (Test Standard) and a second test set containing multiple articles about a small number of popular entities (Test Frequent).
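The split procedure described above can be sketched roughly as follows. This is an illustrative reimplementation, not the curators' actual code; `split_by_entity` and the 80/10/10 ratio are assumptions:

```python
import random
from collections import Counter

def split_by_entity(articles, n_frequent=4, seed=0):
    """Entity-disjoint split with the most frequent entities held out.
    `articles` is a list of dicts with a 'TARGET_ENTITY' key (toy schema);
    the 80/10/10 ratio below is illustrative, not the original one."""
    counts = Counter(a["TARGET_ENTITY"] for a in articles)
    frequent = {e for e, _ in counts.most_common(n_frequent)}
    test_frequent = [a for a in articles if a["TARGET_ENTITY"] in frequent]
    rest = [e for e in counts if e not in frequent]
    random.Random(seed).shuffle(rest)
    n = len(rest)
    # Assign entities (not articles) to splits so splits stay entity-disjoint.
    train_e = set(rest[: int(0.8 * n)])
    dev_e = set(rest[int(0.8 * n) : int(0.9 * n)])
    splits = {"train": [], "validation": [], "test_random": [],
              "test_frequent": test_frequent}
    for a in articles:
        e = a["TARGET_ENTITY"]
        if e in frequent:
            continue  # already in test_frequent
        key = "train" if e in train_e else "validation" if e in dev_e else "test_random"
        splits[key].append(a)
    return splits

# Toy demo: one dominant entity ("A") and ten singleton entities.
articles = [{"TARGET_ENTITY": "A"} for _ in range(5)]
articles += [{"TARGET_ENTITY": e} for e in "BCDEFGHIJK"]
splits = split_by_entity(articles, n_frequent=1)
print({k: len(v) for k, v in splits.items()})
# {'train': 8, 'validation': 1, 'test_random': 1, 'test_frequent': 5}
```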
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
Articles were selected from 3 sources:
1. MPQA (Deng and Wiebe, 2015; Wiebe et al., 2005): This dataset contains news articles manually annotated for opinions, beliefs, emotions, sentiments, speculations, etc. It also has target annotations which are entities and event anchored to the heads of noun or verb phrases. All decisions on this dataset are made on sentence-level and over short spans.
2. KBP Challenge (Ellis et al., 2014): This resource contains the TAC 2014 KBP English sentiment slot-filling challenge dataset, a document-level task. Given an entity and a sentiment (positive/negative) from the document, the goal is to find the entities toward which the original entity holds the given sentiment. We selected documents from this resource that had been used in similar work on sentiment analysis (Choi et al., 2016).
3. Media Rank (Ye and Skiena, 2019): This dataset ranks about 50k news sources along different aspects. It is also used for classifying political ideology of news articles (Kulkarni et al., 2018).
Pre-processing steps:
- First, we find all the person entities in each article using the Stanford NER (Named Entity Recognition) tagger (Finkel et al., 2005), and all mentions of them using co-reference resolution (Clark and Manning, 2016; Co, 2017).
- We removed articles which are not likely to have a main entity of focus. We used a simple heuristic of removing articles in which the most frequent person entity is mentioned only three times or less (even when counting co-referent mentions).
- For the articles that remain, we deemed the most frequent entity to be the main entity of the article. We also filtered out extremely long and extremely short articles, keeping only those with at least 3 and at most 16 paragraphs.
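The filtering heuristic above can be sketched as a simple predicate (an illustrative reimplementation, not the curators' actual code):

```python
def keep_article(mention_count, n_paragraphs):
    """Illustrative version of the filtering heuristic: keep an article only
    if its most frequent person entity is mentioned more than three times
    (counting co-referent mentions) and it has 3 to 16 paragraphs."""
    return mention_count > 3 and 3 <= n_paragraphs <= 16

print(keep_article(5, 10))  # True
print(keep_article(3, 10))  # False: main entity mentioned only three times
print(keep_article(8, 2))   # False: fewer than 3 paragraphs
```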
Documents are randomly separated into train, dev, and two test sets. We ensure that each entity appears in only one of the sets; our goal here is to avoid easy-to-learn biases over entities. To keep the most frequent entities from dominating the training or test sets, we remove articles that cover the most frequent entities and use them as a separate test set (referred to as the frequent test set) in addition to the randomly drawn standard test set.
### Annotations
#### Annotation process
We obtained document- and paragraph-level annotations with the help of Amazon Mechanical Turk workers. The workers first verified that the target entity we provided is indeed the main entity in the document. Then, they rated each paragraph in a document that contained a direct mention of or a reference to the target entity. Last, they rated the sentiment towards the entity based on the entire document. In both cases, the workers assessed the author's view based on what the author said about the target entity. For both paragraph- and document-level sentiment, the workers chose from five rating categories: Negative, Slightly Negative, Neutral, Slightly Positive, or Positive. We then combined the fine-grained annotations to obtain three coarse-grained classes: Negative, Neutral, or Positive.
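The collapsing of the five rating categories into three classes can be expressed as a simple lookup. The exact rule shown is an assumption; the card only states that the fine-grained annotations were combined:

```python
# Hypothetical collapsing rule: folding the "Slightly" ratings into their
# neighbouring poles is one natural reading of "combine the fine-grained
# annotations", shown here for illustration.
FINE_TO_COARSE = {
    "Negative": "Negative",
    "Slightly Negative": "Negative",
    "Neutral": "Neutral",
    "Slightly Positive": "Positive",
    "Positive": "Positive",
}

print(FINE_TO_COARSE["Slightly Positive"])  # Positive
```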
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
[More Information Needed]
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[Creative Commons Attribution 4.0 International License](http://creativecommons.org/licenses/by/4.0/)
### Citation Information
```
@inproceedings{bastan2020authors,
title={Author's Sentiment Prediction},
author={Mohaddeseh Bastan and Mahnaz Koupaee and Youngseo Son and Richard Sicoli and Niranjan Balasubramanian},
year={2020},
eprint={2011.06128},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
### Contributions
Thanks to [@jeromeku](https://github.com/jeromeku) for adding this dataset. |
Chaidi/text-topic-classification | ---
license: apache-2.0
task_categories:
- text-classification
language:
- ch
tags:
- art
pretty_name: p
size_categories:
- n<1K
--- |
lmg-anon/VNTL-v3-1k | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: val
path: data/val-*
dataset_info:
features:
- name: text
dtype: string
- name: ignore_loss
sequence: int64
splits:
- name: train
num_bytes: 26306600
num_examples: 10939
- name: val
num_bytes: 3872937
num_examples: 1639
download_size: 13652180
dataset_size: 30179537
---
# Dataset Card for "VNTL-v3-1k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CVasNLPExperiments/Hatefulmemes_test_google_flan_t5_xl_mode_T_A_D_PNP_FILTER_C_OCR_rices_ns_1000 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: prompt
sequence: string
- name: true_label
dtype: string
- name: prediction
dtype: string
splits:
- name: fewshot_0_clip_tags_ViT_L_14_with_openai_Attributes_ViT_L_14_descriptors_text_davinci_003_full_DETA_detections_deta_swin_large_o365_coco_classes_caption_module_random_text
num_bytes: 9989438
num_examples: 1000
- name: fewshot_0_clip_tags_LAION_ViT_H_14_2B_with_openai_Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full_DETA_detections_deta_swin_large_o365_coco_classes_caption_module_random_text
num_bytes: 9985757
num_examples: 1000
download_size: 3297008
dataset_size: 19975195
---
# Dataset Card for "Hatefulmemes_test_google_flan_t5_xl_mode_T_A_D_PNP_FILTER_C_OCR_rices_ns_1000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
deetsadi/processed_dwi_sobel_with_adc | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
- name: conditioning_image
dtype: image
splits:
- name: train
num_bytes: 40077622.0
num_examples: 200
download_size: 40079175
dataset_size: 40077622.0
---
# Dataset Card for "processed_dwi_sobel_with_adc"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
hopee4/cariucha | ---
license: openrail
---
|
mteb/cqadupstack-english | ---
language:
- en
multilinguality:
- monolingual
task_categories:
- text-retrieval
source_datasets:
- cqadupstack-english
task_ids:
- document-retrieval
config_names:
- corpus
tags:
- text-retrieval
dataset_info:
- config_name: default
features:
- name: query-id
dtype: string
- name: corpus-id
dtype: string
- name: score
dtype: float64
splits:
- name: test
num_bytes: 100171
num_examples: 3765
- config_name: corpus
features:
- name: _id
dtype: string
- name: title
dtype: string
- name: text
dtype: string
splits:
- name: corpus
num_bytes: 20194221
num_examples: 40221
- config_name: queries
features:
- name: _id
dtype: string
- name: text
dtype: string
splits:
- name: queries
num_bytes: 97308
num_examples: 1570
configs:
- config_name: default
data_files:
- split: test
path: qrels/test.jsonl
- config_name: corpus
data_files:
- split: corpus
path: corpus.jsonl
- config_name: queries
data_files:
- split: queries
path: queries.jsonl
--- |
tyzhu/fw_num_train_1000_eval_100 | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 137382
num_examples: 2100
- name: eval_find_word
num_bytes: 4723
num_examples: 100
download_size: 58570
dataset_size: 142105
---
# Dataset Card for "fw_num_train_1000_eval_100"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
zolak/twitter_dataset_81_1713120565 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 330719
num_examples: 835
download_size: 169183
dataset_size: 330719
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
microsoft/LCC_csharp | ---
dataset_info:
features:
- name: context
dtype: string
- name: gt
dtype: string
splits:
- name: train
num_bytes: 1851797668
num_examples: 100000
- name: validation
num_bytes: 136620599
num_examples: 10000
- name: test
num_bytes: 136701413
num_examples: 10000
download_size: 581666513
dataset_size: 2125119680
---
# Dataset Card for "LCC_csharp"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Alluethrenn/NASA_Datasets | ---
license: mit
---
|
ic-fspml/stock_news_sentiment | ---
dataset_info:
features:
- name: ticker
dtype: string
- name: name
dtype: string
- name: type
dtype: string
- name: sector
dtype: string
- name: article_date
dtype: timestamp[ns, tz=UTC]
- name: article_headline
dtype: string
- name: label
dtype: string
splits:
- name: train
num_bytes: 31727430
num_examples: 200998
- name: validation
num_bytes: 3172024
num_examples: 20100
- name: test
num_bytes: 4753186
num_examples: 30150
download_size: 20803817
dataset_size: 39652640
---
# Dataset Card for "stock_news_sentiment"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jack4444b/ALP_Behavioral_ECON_QA | ---
license: mit
---
|
n1ghtf4l1/super-collider | ---
license: mit
---
|
blanchon/ChaBuD_MSI | ---
language: en
license: unknown
task_categories:
- change-detection
pretty_name: ChaBuD MSI
tags:
- remote-sensing
- earth-observation
- geospatial
- satellite-imagery
- change-detection
- sentinel-2
dataset_info:
features:
- name: image1
dtype:
array3_d:
dtype: uint8
shape:
- 512
- 512
- 13
- name: image2
dtype:
array3_d:
dtype: uint8
shape:
- 512
- 512
- 13
- name: mask
dtype: image
splits:
- name: train
num_bytes: 2624716428.0
num_examples: 278
- name: validation
num_bytes: 736431228.0
num_examples: 78
download_size: 2232652835
dataset_size: 3361147656.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
# ChaBuD MSI
<!-- Dataset thumbnail -->

<!-- Provide a quick summary of the dataset. -->
ChaBuD (Change detection for Burned area Delineation) is a dataset used in the ChaBuD ECML-PKDD 2023 Discovery Challenge. This is the MSI version with 13 bands.
- **Paper:** https://doi.org/10.1016/j.rse.2021.112603
- **Homepage:** https://huggingface.co/spaces/competitions/ChaBuD-ECML-PKDD2023
## Description
<!-- Provide a longer summary of what this dataset is. -->
- **Total Number of Images**: 356
- **Bands**: 13 (MSI)
- **Image Size**: 512x512
- **Image Resolution**: 10m
- **Land Cover Classes**: 2
- **Classes**: no change, burned area
- **Source**: Sentinel-2
## Usage
To use this dataset, simply use `datasets.load_dataset("blanchon/ChaBuD_MSI")`.
<!-- Provide any additional information on how to use this dataset. -->
```python
from datasets import load_dataset
ChaBuD_MSI = load_dataset("blanchon/ChaBuD_MSI")
```
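Each example pairs two 13-band 512x512 uint8 images (pre- and post-event) with a burn mask. A minimal sketch of a per-pixel change signal, using synthetic arrays in place of real samples (the band-averaged difference and the threshold of 64 are illustrative choices, not the challenge's method):

```python
import numpy as np

# Synthetic stand-ins for one ChaBuD example: two 13-band images shaped
# (512, 512, 13), matching the dataset's array3_d feature.
rng = np.random.default_rng(0)
image1 = rng.integers(0, 256, size=(512, 512, 13), dtype=np.uint8)
image2 = rng.integers(0, 256, size=(512, 512, 13), dtype=np.uint8)

# Per-pixel change magnitude: mean absolute difference across bands
# (cast to int16 first so the uint8 subtraction cannot wrap around).
diff = np.abs(image1.astype(np.int16) - image2.astype(np.int16))
change = diff.mean(axis=-1)  # shape (512, 512)

# Threshold into a rough binary change map (threshold chosen arbitrarily).
change_map = change > 64
print(change.shape)  # (512, 512)
```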
## Citation
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
If you use the ChaBuD_MSI dataset in your research, please consider citing the following publication:
```bibtex
@article{TURKOGLU2021112603,
title = {Crop mapping from image time series: Deep learning with multi-scale label hierarchies},
journal = {Remote Sensing of Environment},
volume = {264},
pages = {112603},
year = {2021},
issn = {0034-4257},
doi = {https://doi.org/10.1016/j.rse.2021.112603},
url = {https://www.sciencedirect.com/science/article/pii/S0034425721003230},
author = {Mehmet Ozgur Turkoglu and Stefano D'Aronco and Gregor Perich and Frank Liebisch and Constantin Streit and Konrad Schindler and Jan Dirk Wegner},
keywords = {Deep learning, Recurrent neural network (RNN), Convolutional RNN, Hierarchical classification, Multi-stage, Crop classification, Multi-temporal, Time series},
}
```
|
signal-k/planets | ---
license: mit
---
|
nguyenminh871/orientdb_1_6_2 | ---
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: func
dtype: string
- name: target
dtype: bool
- name: project
dtype: string
splits:
- name: orientdb_1_6_2
num_bytes: 8348430
num_examples: 2098
download_size: 2141816
dataset_size: 8348430
---
# Dataset Card for "orientdb_1_6_2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dnovak232/sql_create_context-v4-mssql-instruct_v1.0 | ---
dataset_info:
features:
- name: input
dtype: string
- name: schema
dtype: string
- name: output
dtype: string
- name: instruction
dtype: string
splits:
- name: train
num_bytes: 30026435
num_examples: 78285
download_size: 8752475
dataset_size: 30026435
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_alnrg2arg__blockchainlabs_7B_merged_test2_4 | ---
pretty_name: Evaluation run of alnrg2arg/blockchainlabs_7B_merged_test2_4
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [alnrg2arg/blockchainlabs_7B_merged_test2_4](https://huggingface.co/alnrg2arg/blockchainlabs_7B_merged_test2_4)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_alnrg2arg__blockchainlabs_7B_merged_test2_4\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-20T09:52:41.122319](https://huggingface.co/datasets/open-llm-leaderboard/details_alnrg2arg__blockchainlabs_7B_merged_test2_4/blob/main/results_2024-01-20T09-52-41.122319.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.652927958678689,\n\
\ \"acc_stderr\": 0.0321169960910649,\n \"acc_norm\": 0.6519652759500019,\n\
\ \"acc_norm_stderr\": 0.03279242565970157,\n \"mc1\": 0.576499388004896,\n\
\ \"mc1_stderr\": 0.01729742144853475,\n \"mc2\": 0.6976711663625277,\n\
\ \"mc2_stderr\": 0.015093001598591628\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7150170648464164,\n \"acc_stderr\": 0.013191348179838793,\n\
\ \"acc_norm\": 0.735494880546075,\n \"acc_norm_stderr\": 0.012889272949313368\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7229635530770763,\n\
\ \"acc_stderr\": 0.004466200055292544,\n \"acc_norm\": 0.8886675960963951,\n\
\ \"acc_norm_stderr\": 0.0031390048159258633\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n\
\ \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n\
\ \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n\
\ \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7245283018867924,\n \"acc_stderr\": 0.027495663683724057,\n\
\ \"acc_norm\": 0.7245283018867924,\n \"acc_norm_stderr\": 0.027495663683724057\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n\
\ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6763005780346821,\n\
\ \"acc_stderr\": 0.0356760379963917,\n \"acc_norm\": 0.6763005780346821,\n\
\ \"acc_norm_stderr\": 0.0356760379963917\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n\
\ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.032400380867927465,\n\
\ \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.032400380867927465\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370333,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370333\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41005291005291006,\n \"acc_stderr\": 0.025331202438944423,\n \"\
acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.025331202438944423\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411019,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411019\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7903225806451613,\n\
\ \"acc_stderr\": 0.023157879349083525,\n \"acc_norm\": 0.7903225806451613,\n\
\ \"acc_norm_stderr\": 0.023157879349083525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n\
\ \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009181,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009181\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586818,\n \"\
acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586818\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.02150024957603348,\n\
\ \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.02150024957603348\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402534,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402534\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32592592592592595,\n \"acc_stderr\": 0.02857834836547308,\n \
\ \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.02857834836547308\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886786,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886786\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8477064220183487,\n \"acc_stderr\": 0.015405084393157074,\n \"\
acc_norm\": 0.8477064220183487,\n \"acc_norm_stderr\": 0.015405084393157074\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5185185185185185,\n \"acc_stderr\": 0.03407632093854051,\n \"\
acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.03407632093854051\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8480392156862745,\n \"acc_stderr\": 0.025195658428931792,\n \"\
acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.025195658428931792\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7932489451476793,\n \"acc_stderr\": 0.0263616516683891,\n \
\ \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.0263616516683891\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n\
\ \"acc_stderr\": 0.046840993210771065,\n \"acc_norm\": 0.41964285714285715,\n\
\ \"acc_norm_stderr\": 0.046840993210771065\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \
\ \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252609\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8263090676883781,\n\
\ \"acc_stderr\": 0.01354741565866226,\n \"acc_norm\": 0.8263090676883781,\n\
\ \"acc_norm_stderr\": 0.01354741565866226\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7369942196531792,\n \"acc_stderr\": 0.023703099525258176,\n\
\ \"acc_norm\": 0.7369942196531792,\n \"acc_norm_stderr\": 0.023703099525258176\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4480446927374302,\n\
\ \"acc_stderr\": 0.016631976628930595,\n \"acc_norm\": 0.4480446927374302,\n\
\ \"acc_norm_stderr\": 0.016631976628930595\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.02555316999182652,\n\
\ \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.02555316999182652\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n\
\ \"acc_stderr\": 0.025583062489984813,\n \"acc_norm\": 0.7170418006430869,\n\
\ \"acc_norm_stderr\": 0.025583062489984813\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.023993501709042107,\n\
\ \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.023993501709042107\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48936170212765956,\n \"acc_stderr\": 0.029820747191422473,\n \
\ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.029820747191422473\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4680573663624511,\n\
\ \"acc_stderr\": 0.012744149704869649,\n \"acc_norm\": 0.4680573663624511,\n\
\ \"acc_norm_stderr\": 0.012744149704869649\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.028332959514031208,\n\
\ \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.028332959514031208\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6683006535947712,\n \"acc_stderr\": 0.01904748523936038,\n \
\ \"acc_norm\": 0.6683006535947712,\n \"acc_norm_stderr\": 0.01904748523936038\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7510204081632653,\n \"acc_stderr\": 0.027682979522960234,\n\
\ \"acc_norm\": 0.7510204081632653,\n \"acc_norm_stderr\": 0.027682979522960234\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197771,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197771\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\
\ \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n\
\ \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.576499388004896,\n\
\ \"mc1_stderr\": 0.01729742144853475,\n \"mc2\": 0.6976711663625277,\n\
\ \"mc2_stderr\": 0.015093001598591628\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8445146014206788,\n \"acc_stderr\": 0.010184308214775777\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7043214556482184,\n \
\ \"acc_stderr\": 0.012570068947898772\n }\n}\n```"
repo_url: https://huggingface.co/TeeZee/DarkSapling-7B-v1.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_20T09_52_41.122319
path:
- '**/details_harness|arc:challenge|25_2024-01-20T09-52-41.122319.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-20T09-52-41.122319.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_20T09_52_41.122319
path:
- '**/details_harness|gsm8k|5_2024-01-20T09-52-41.122319.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-20T09-52-41.122319.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_20T09_52_41.122319
path:
- '**/details_harness|hellaswag|10_2024-01-20T09-52-41.122319.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-20T09-52-41.122319.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_20T09_52_41.122319
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-20T09-52-41.122319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-20T09-52-41.122319.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-20T09-52-41.122319.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_20T09_52_41.122319
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T09-52-41.122319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T09-52-41.122319.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_20T09_52_41.122319
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-20T09-52-41.122319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-20T09-52-41.122319.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_20T09_52_41.122319
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-20T09-52-41.122319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-20T09-52-41.122319.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_20T09_52_41.122319
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T09-52-41.122319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T09-52-41.122319.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_20T09_52_41.122319
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T09-52-41.122319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T09-52-41.122319.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_20T09_52_41.122319
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-20T09-52-41.122319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-20T09-52-41.122319.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_20T09_52_41.122319
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T09-52-41.122319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T09-52-41.122319.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_20T09_52_41.122319
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T09-52-41.122319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T09-52-41.122319.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_20T09_52_41.122319
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T09-52-41.122319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T09-52-41.122319.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_20T09_52_41.122319
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T09-52-41.122319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T09-52-41.122319.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_20T09_52_41.122319
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-20T09-52-41.122319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-20T09-52-41.122319.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_20T09_52_41.122319
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-20T09-52-41.122319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-20T09-52-41.122319.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_20T09_52_41.122319
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T09-52-41.122319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T09-52-41.122319.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_20T09_52_41.122319
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-20T09-52-41.122319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-20T09-52-41.122319.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_20T09_52_41.122319
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T09-52-41.122319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T09-52-41.122319.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_20T09_52_41.122319
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T09-52-41.122319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T09-52-41.122319.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_20T09_52_41.122319
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T09-52-41.122319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T09-52-41.122319.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_20T09_52_41.122319
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-20T09-52-41.122319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-20T09-52-41.122319.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_20T09_52_41.122319
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T09-52-41.122319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T09-52-41.122319.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_20T09_52_41.122319
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T09-52-41.122319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T09-52-41.122319.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_20T09_52_41.122319
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T09-52-41.122319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T09-52-41.122319.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_20T09_52_41.122319
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T09-52-41.122319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T09-52-41.122319.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_20T09_52_41.122319
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T09-52-41.122319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T09-52-41.122319.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_20T09_52_41.122319
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T09-52-41.122319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T09-52-41.122319.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_20T09_52_41.122319
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T09-52-41.122319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T09-52-41.122319.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_20T09_52_41.122319
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T09-52-41.122319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T09-52-41.122319.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_20T09_52_41.122319
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T09-52-41.122319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T09-52-41.122319.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_20T09_52_41.122319
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T09-52-41.122319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T09-52-41.122319.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_20T09_52_41.122319
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T09-52-41.122319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T09-52-41.122319.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_20T09_52_41.122319
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T09-52-41.122319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T09-52-41.122319.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_20T09_52_41.122319
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T09-52-41.122319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T09-52-41.122319.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_20T09_52_41.122319
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T09-52-41.122319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T09-52-41.122319.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_20T09_52_41.122319
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-20T09-52-41.122319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-20T09-52-41.122319.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_20T09_52_41.122319
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T09-52-41.122319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T09-52-41.122319.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_20T09_52_41.122319
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-20T09-52-41.122319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-20T09-52-41.122319.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_20T09_52_41.122319
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T09-52-41.122319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T09-52-41.122319.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_20T09_52_41.122319
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T09-52-41.122319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T09-52-41.122319.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_20T09_52_41.122319
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T09-52-41.122319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T09-52-41.122319.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_20T09_52_41.122319
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-20T09-52-41.122319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-20T09-52-41.122319.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_20T09_52_41.122319
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-20T09-52-41.122319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-20T09-52-41.122319.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_20T09_52_41.122319
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T09-52-41.122319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T09-52-41.122319.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_20T09_52_41.122319
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T09-52-41.122319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T09-52-41.122319.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_20T09_52_41.122319
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T09-52-41.122319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T09-52-41.122319.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_20T09_52_41.122319
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T09-52-41.122319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T09-52-41.122319.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_20T09_52_41.122319
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-20T09-52-41.122319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-20T09-52-41.122319.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_20T09_52_41.122319
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-20T09-52-41.122319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-20T09-52-41.122319.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_20T09_52_41.122319
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-20T09-52-41.122319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-20T09-52-41.122319.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_20T09_52_41.122319
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T09-52-41.122319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T09-52-41.122319.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_20T09_52_41.122319
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-20T09-52-41.122319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-20T09-52-41.122319.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_20T09_52_41.122319
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T09-52-41.122319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T09-52-41.122319.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_20T09_52_41.122319
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T09-52-41.122319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T09-52-41.122319.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_20T09_52_41.122319
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-20T09-52-41.122319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-20T09-52-41.122319.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_20T09_52_41.122319
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-20T09-52-41.122319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-20T09-52-41.122319.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_20T09_52_41.122319
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-20T09-52-41.122319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-20T09-52-41.122319.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_20T09_52_41.122319
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T09-52-41.122319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T09-52-41.122319.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_20T09_52_41.122319
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-20T09-52-41.122319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-20T09-52-41.122319.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_20T09_52_41.122319
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-20T09-52-41.122319.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-20T09-52-41.122319.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_20T09_52_41.122319
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-20T09-52-41.122319.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-20T09-52-41.122319.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_20T09_52_41.122319
path:
- '**/details_harness|winogrande|5_2024-01-20T09-52-41.122319.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-20T09-52-41.122319.parquet'
- config_name: results
data_files:
- split: 2024_01_20T09_52_41.122319
path:
- results_2024-01-20T09-52-41.122319.parquet
- split: latest
path:
- results_2024-01-20T09-52-41.122319.parquet
---
# Dataset Card for Evaluation run of alnrg2arg/blockchainlabs_7B_merged_test2_4
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [alnrg2arg/blockchainlabs_7B_merged_test2_4](https://huggingface.co/alnrg2arg/blockchainlabs_7B_merged_test2_4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_alnrg2arg__blockchainlabs_7B_merged_test2_4",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-20T09:52:41.122319](https://huggingface.co/datasets/open-llm-leaderboard/details_alnrg2arg__blockchainlabs_7B_merged_test2_4/blob/main/results_2024-01-20T09-52-41.122319.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.652927958678689,
"acc_stderr": 0.0321169960910649,
"acc_norm": 0.6519652759500019,
"acc_norm_stderr": 0.03279242565970157,
"mc1": 0.576499388004896,
"mc1_stderr": 0.01729742144853475,
"mc2": 0.6976711663625277,
"mc2_stderr": 0.015093001598591628
},
"harness|arc:challenge|25": {
"acc": 0.7150170648464164,
"acc_stderr": 0.013191348179838793,
"acc_norm": 0.735494880546075,
"acc_norm_stderr": 0.012889272949313368
},
"harness|hellaswag|10": {
"acc": 0.7229635530770763,
"acc_stderr": 0.004466200055292544,
"acc_norm": 0.8886675960963951,
"acc_norm_stderr": 0.0031390048159258633
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7245283018867924,
"acc_stderr": 0.027495663683724057,
"acc_norm": 0.7245283018867924,
"acc_norm_stderr": 0.027495663683724057
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.0356760379963917,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.0356760379963917
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.032400380867927465,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.032400380867927465
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370333,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370333
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41005291005291006,
"acc_stderr": 0.025331202438944423,
"acc_norm": 0.41005291005291006,
"acc_norm_stderr": 0.025331202438944423
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.023157879349083525,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.023157879349083525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009181,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009181
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586818,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586818
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.02150024957603348,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.02150024957603348
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402534,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402534
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.02857834836547308,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.02857834836547308
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.030388353551886786,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.030388353551886786
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.038615575462551684,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.038615575462551684
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8477064220183487,
"acc_stderr": 0.015405084393157074,
"acc_norm": 0.8477064220183487,
"acc_norm_stderr": 0.015405084393157074
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.03407632093854051,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.03407632093854051
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.025195658428931792,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.025195658428931792
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7932489451476793,
"acc_stderr": 0.0263616516683891,
"acc_norm": 0.7932489451476793,
"acc_norm_stderr": 0.0263616516683891
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228732,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228732
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.046840993210771065,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.046840993210771065
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8263090676883781,
"acc_stderr": 0.01354741565866226,
"acc_norm": 0.8263090676883781,
"acc_norm_stderr": 0.01354741565866226
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7369942196531792,
"acc_stderr": 0.023703099525258176,
"acc_norm": 0.7369942196531792,
"acc_norm_stderr": 0.023703099525258176
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4480446927374302,
"acc_stderr": 0.016631976628930595,
"acc_norm": 0.4480446927374302,
"acc_norm_stderr": 0.016631976628930595
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.02555316999182652,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.02555316999182652
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.025583062489984813,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.025583062489984813
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7530864197530864,
"acc_stderr": 0.023993501709042107,
"acc_norm": 0.7530864197530864,
"acc_norm_stderr": 0.023993501709042107
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.029820747191422473,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.029820747191422473
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4680573663624511,
"acc_stderr": 0.012744149704869649,
"acc_norm": 0.4680573663624511,
"acc_norm_stderr": 0.012744149704869649
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.028332959514031208,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.028332959514031208
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6683006535947712,
"acc_stderr": 0.01904748523936038,
"acc_norm": 0.6683006535947712,
"acc_norm_stderr": 0.01904748523936038
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7510204081632653,
"acc_stderr": 0.027682979522960234,
"acc_norm": 0.7510204081632653,
"acc_norm_stderr": 0.027682979522960234
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197771,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197771
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.576499388004896,
"mc1_stderr": 0.01729742144853475,
"mc2": 0.6976711663625277,
"mc2_stderr": 0.015093001598591628
},
"harness|winogrande|5": {
"acc": 0.8445146014206788,
"acc_stderr": 0.010184308214775777
},
"harness|gsm8k|5": {
"acc": 0.7043214556482184,
"acc_stderr": 0.012570068947898772
}
}
```
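Working with these per-task dicts is straightforward; for instance, a mean over the `hendrycksTest-*` entries gives an MMLU-style aggregate. The sketch below uses a two-task excerpt of the dict above for illustration, and is not necessarily the exact aggregation the leaderboard applies:

```python
# Two-task excerpt of the results dict shown above; the full dict
# contains 57 hendrycksTest (MMLU) subtasks.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc_norm": 0.33},
    "harness|hendrycksTest-anatomy|5": {"acc_norm": 0.6296296296296297},
}

# Collect acc_norm for every MMLU subtask and average them.
mmlu_scores = [v["acc_norm"] for k, v in results.items() if "hendrycksTest" in k]
mmlu_avg = sum(mmlu_scores) / len(mmlu_scores)
print(round(mmlu_avg, 4))
```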
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
manishiitg/custom-data-v2 | ---
dataset_info:
features:
- name: system
dtype: string
- name: instruction
dtype: string
- name: response
dtype: string
- name: lang
dtype: string
- name: judgement
dtype: string
- name: rating
dtype: float64
- name: judgement_pending
dtype: bool
- name: rated_by
dtype: string
splits:
- name: train
num_bytes: 365886127
num_examples: 105220
download_size: 155415815
dataset_size: 365886127
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
FinanceInc/auditor_sentiment | ---
annotations_creators:
- expert-generated
language_creators:
- found
language:
- en
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- text-classification
task_ids:
- multi-class-classification
- sentiment-classification
paperswithcode_id: null
pretty_name: Auditor_Sentiment
---
# Dataset Card for Auditor Sentiment
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
## Dataset Description
Auditor review sentiment collected by the News Department.
- **Point of Contact:**
Talked to COE for Auditing, currently sue@demo.org
### Dataset Summary
Auditor sentiment dataset of sentences from financial news. The dataset consists of several thousand sentences from English language financial news categorized by sentiment.
### Supported Tasks and Leaderboards
Sentiment Classification
### Languages
English
## Dataset Structure
### Data Instances
```
"sentence": "Pharmaceuticals group Orion Corp reported a fall in its third-quarter earnings that were hit by larger expenditures on R&D and marketing .",
"label": "negative"
```
### Data Fields
- sentence: a tokenized line from the dataset
- label: a label corresponding to the class as a string: 'positive' - (2), 'neutral' - (1), or 'negative' - (0)
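The parenthesized numbers are the integer class ids; a small helper for converting between string labels and ids, assuming the mapping listed above, might look like:

```python
# Label mapping as documented above: negative=0, neutral=1, positive=2
id2label = {0: "negative", 1: "neutral", 2: "positive"}
label2id = {name: idx for idx, name in id2label.items()}

print(label2id["negative"])  # 0
print(id2label[2])           # positive
```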
### Data Splits
A train/test split was created randomly with a 75/25 ratio.
## Dataset Creation
### Curation Rationale
To gather our auditor evaluations into one dataset. Previous attempts using off-the-shelf sentiment models achieved only 70% F1; this dataset was an attempt to improve on that performance.
### Source Data
#### Initial Data Collection and Normalization
The corpus used here consists of English-language news reports.
#### Who are the source language producers?
The source data was written by various auditors.
### Annotations
#### Annotation process
This release of the auditor reviews covers a collection of 4,840 sentences. The selected collection of phrases was annotated by 16 people with adequate background knowledge of financial markets. The subset included here is the one where inter-annotator agreement was greater than 75%.
#### Who are the annotators?
They were pulled from the SME list, names are held by sue@demo.org
### Personal and Sensitive Information
There is no personal or sensitive information in this dataset.
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
All annotators were from the same institution, so inter-annotator agreement should be interpreted with this in mind.
### Licensing Information
License: Demo.Org Proprietary - DO NOT SHARE
This dataset is based on the [financial phrasebank](https://huggingface.co/datasets/financial_phrasebank) dataset. |
RaivisDejus/latvian-text | ---
annotations_creators:
- found
language:
- lv
language_creators:
- found
license:
- cc-by-4.0
multilinguality:
- monolingual
pretty_name: Latvian text dataset
size_categories:
- 10K<n<100K
source_datasets:
- extended|tilde_model
- extended|wikipedia
- extended|europarl_bilingual
tags:
- lv
- latvian
task_categories:
- automatic-speech-recognition
task_ids: []
---
# Latvian text dataset
Dataset of Latvian-language texts, intended for use in AI tool development such as speech recognition or spellcheckers.
## Data sources used
* Latvian Wikisource articles - https://wikisource.org/wiki/Category:Latvian
* Literary works of Rainis - https://repository.clarin.lv/repository/xmlui/handle/20.500.12574/41
* Latvian Wikipedia articles - https://huggingface.co/datasets/joelito/EU_Wikipedias
* European Parliament Proceedings Parallel Corpus - https://huggingface.co/datasets/europarl_bilingual
* Tilde MODEL Corpus - Multilingual Open Data for European Languages - https://huggingface.co/datasets/tilde_model
To get the Wikipedia dataset (197 MB), run:
```
python tools/wikipedia/GetWikipedia.py
```
To get the Europarl dataset (1.7 GB), run:
```
python tools/europarl/GetEuroparl.py
```
To get the Tilde dataset (834 MB), run:
```
python tools/europarl/GetTilde.py
```
To combine all datasets, run:
```
sh combine-all.sh
```
To clean out some junk, run:
```
sh clean.sh
```
You may also want to remove duplicate lines. To do so, run:
```
sort lv.txt | uniq > lv-uniq.txt
```
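Note that `sort | uniq` also reorders the corpus. If the original line order matters, an order-preserving deduplication can be sketched in a few lines of Python (assuming the file fits in memory line by line):

```python
def dedupe(lines):
    """Keep the first occurrence of each line, preserving order."""
    seen = set()
    out = []
    for line in lines:
        if line not in seen:
            seen.add(line)
            out.append(line)
    return out

# Duplicates removed, original order kept
print(dedupe(["b", "a", "b", "c", "a"]))  # ['b', 'a', 'c']
```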
## Notes
Possible future sources
* Parliament proceedings transcripts - https://www.saeima.lv/lv/transcripts
* Discussions of Latvian Wikipedia pages - https://lv.wikipedia.org/wiki/Special:AllPages
* Out of copyright books from LNB collection - https://data.gov.lv/dati/lv/dataset/gramatu-digitala-kolekcija
Data sets not used
* Web scrapes, as they tend to yield data from comments with improper spelling like "atrashanaas vieta" instead of "atrašanās vieta"
* Open Subtitles, as they contain data with improper spelling like "atrashanaas vieta" instead of "atrašanās vieta"
Possible issues:
* Datasets contain foreign-language characters, like "蠻子", or Cyrillic ones, e.g. "Рига"
jerome-white/alpaca-irt-stan | ---
license: cc-by-4.0
dataset_info:
features:
- name: parameter
dtype: string
- name: sample
dtype: int64
- name: value
dtype: float64
- name: chain
dtype: int64
- name: element
dtype: string
splits:
- name: train
num_bytes: 1806854850
num_examples: 9488450
download_size: 161482164
dataset_size: 1806854850
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ndorr16/RockingDuck | ---
license: gpl-3.0
---
|
Atipico1/NQ_preprocessed | ---
dataset_info:
features:
- name: question
dtype: string
- name: answers
sequence: string
- name: ctxs
list:
- name: hasanswer
dtype: bool
- name: id
dtype: string
- name: score
dtype: float64
- name: text
dtype: string
- name: title
dtype: string
- name: masked_query
dtype: string
- name: query_embedding
sequence: float32
splits:
- name: train
num_bytes: 64558633
num_examples: 10000
- name: test
num_bytes: 23378336
num_examples: 3610
download_size: 77819218
dataset_size: 87936969
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
AlekseyKorshuk/davinci-pairwise-filtered | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
splits:
- name: train
num_bytes: 1517383530
num_examples: 93540
- name: test
num_bytes: 123825205
num_examples: 14391
download_size: 316920124
dataset_size: 1641208735
---
# Dataset Card for "davinci-pairwise-filtered"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_cyberagent__calm2-7b-chat | ---
pretty_name: Evaluation run of cyberagent/calm2-7b-chat
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [cyberagent/calm2-7b-chat](https://huggingface.co/cyberagent/calm2-7b-chat) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_cyberagent__calm2-7b-chat\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-11T04:41:23.645738](https://huggingface.co/datasets/open-llm-leaderboard/details_cyberagent__calm2-7b-chat/blob/main/results_2023-12-11T04-41-23.645738.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.39379663191330316,\n\
\ \"acc_stderr\": 0.03433785284156447,\n \"acc_norm\": 0.39896146189258175,\n\
\ \"acc_norm_stderr\": 0.0351728212913433,\n \"mc1\": 0.2607099143206854,\n\
\ \"mc1_stderr\": 0.015368841620766367,\n \"mc2\": 0.4196186456267839,\n\
\ \"mc2_stderr\": 0.01433169483869778\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3609215017064846,\n \"acc_stderr\": 0.014034761386175458,\n\
\ \"acc_norm\": 0.40273037542662116,\n \"acc_norm_stderr\": 0.014332236306790147\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5070703047201752,\n\
\ \"acc_stderr\": 0.004989282516055394,\n \"acc_norm\": 0.68123879705238,\n\
\ \"acc_norm_stderr\": 0.004650438781745311\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847415,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847415\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45925925925925926,\n\
\ \"acc_stderr\": 0.04304979692464242,\n \"acc_norm\": 0.45925925925925926,\n\
\ \"acc_norm_stderr\": 0.04304979692464242\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.46710526315789475,\n \"acc_stderr\": 0.04060127035236397,\n\
\ \"acc_norm\": 0.46710526315789475,\n \"acc_norm_stderr\": 0.04060127035236397\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.41,\n\
\ \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.41,\n \
\ \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.43018867924528303,\n \"acc_stderr\": 0.030471445867183235,\n\
\ \"acc_norm\": 0.43018867924528303,\n \"acc_norm_stderr\": 0.030471445867183235\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.375,\n\
\ \"acc_stderr\": 0.04048439222695598,\n \"acc_norm\": 0.375,\n \
\ \"acc_norm_stderr\": 0.04048439222695598\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206824,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206824\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\"\
: 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3699421965317919,\n\
\ \"acc_stderr\": 0.036812296333943194,\n \"acc_norm\": 0.3699421965317919,\n\
\ \"acc_norm_stderr\": 0.036812296333943194\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.04280105837364395,\n\
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.04280105837364395\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.43,\n\
\ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.33191489361702126,\n \"acc_stderr\": 0.03078373675774564,\n\
\ \"acc_norm\": 0.33191489361702126,\n \"acc_norm_stderr\": 0.03078373675774564\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2982456140350877,\n\
\ \"acc_stderr\": 0.04303684033537315,\n \"acc_norm\": 0.2982456140350877,\n\
\ \"acc_norm_stderr\": 0.04303684033537315\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.38620689655172413,\n \"acc_stderr\": 0.04057324734419035,\n\
\ \"acc_norm\": 0.38620689655172413,\n \"acc_norm_stderr\": 0.04057324734419035\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2804232804232804,\n \"acc_stderr\": 0.023135287974325635,\n \"\
acc_norm\": 0.2804232804232804,\n \"acc_norm_stderr\": 0.023135287974325635\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.29365079365079366,\n\
\ \"acc_stderr\": 0.04073524322147126,\n \"acc_norm\": 0.29365079365079366,\n\
\ \"acc_norm_stderr\": 0.04073524322147126\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.36774193548387096,\n \"acc_stderr\": 0.02743086657997347,\n \"\
acc_norm\": 0.36774193548387096,\n \"acc_norm_stderr\": 0.02743086657997347\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.32019704433497537,\n \"acc_stderr\": 0.032826493853041504,\n \"\
acc_norm\": 0.32019704433497537,\n \"acc_norm_stderr\": 0.032826493853041504\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.48484848484848486,\n \"acc_stderr\": 0.03902551007374448,\n\
\ \"acc_norm\": 0.48484848484848486,\n \"acc_norm_stderr\": 0.03902551007374448\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.4444444444444444,\n \"acc_stderr\": 0.035402943770953675,\n \"\
acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.035402943770953675\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.5233160621761658,\n \"acc_stderr\": 0.03604513672442202,\n\
\ \"acc_norm\": 0.5233160621761658,\n \"acc_norm_stderr\": 0.03604513672442202\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.3076923076923077,\n \"acc_stderr\": 0.023400928918310495,\n\
\ \"acc_norm\": 0.3076923076923077,\n \"acc_norm_stderr\": 0.023400928918310495\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24814814814814815,\n \"acc_stderr\": 0.0263357394040558,\n \
\ \"acc_norm\": 0.24814814814814815,\n \"acc_norm_stderr\": 0.0263357394040558\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.029597329730978103,\n\
\ \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.029597329730978103\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.43486238532110094,\n \"acc_stderr\": 0.021254631465609273,\n \"\
acc_norm\": 0.43486238532110094,\n \"acc_norm_stderr\": 0.021254631465609273\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3287037037037037,\n \"acc_stderr\": 0.032036140846700596,\n \"\
acc_norm\": 0.3287037037037037,\n \"acc_norm_stderr\": 0.032036140846700596\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.45588235294117646,\n \"acc_stderr\": 0.03495624522015474,\n \"\
acc_norm\": 0.45588235294117646,\n \"acc_norm_stderr\": 0.03495624522015474\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.4978902953586498,\n \"acc_stderr\": 0.032546938018020076,\n \
\ \"acc_norm\": 0.4978902953586498,\n \"acc_norm_stderr\": 0.032546938018020076\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.47085201793721976,\n\
\ \"acc_stderr\": 0.03350073248773404,\n \"acc_norm\": 0.47085201793721976,\n\
\ \"acc_norm_stderr\": 0.03350073248773404\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.45038167938931295,\n \"acc_stderr\": 0.04363643698524779,\n\
\ \"acc_norm\": 0.45038167938931295,\n \"acc_norm_stderr\": 0.04363643698524779\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5371900826446281,\n \"acc_stderr\": 0.04551711196104218,\n \"\
acc_norm\": 0.5371900826446281,\n \"acc_norm_stderr\": 0.04551711196104218\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.4351851851851852,\n\
\ \"acc_stderr\": 0.04792898170907062,\n \"acc_norm\": 0.4351851851851852,\n\
\ \"acc_norm_stderr\": 0.04792898170907062\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3803680981595092,\n \"acc_stderr\": 0.03814269893261837,\n\
\ \"acc_norm\": 0.3803680981595092,\n \"acc_norm_stderr\": 0.03814269893261837\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n\
\ \"acc_stderr\": 0.04327040932578728,\n \"acc_norm\": 0.29464285714285715,\n\
\ \"acc_norm_stderr\": 0.04327040932578728\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.3592233009708738,\n \"acc_stderr\": 0.04750458399041693,\n\
\ \"acc_norm\": 0.3592233009708738,\n \"acc_norm_stderr\": 0.04750458399041693\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.5170940170940171,\n\
\ \"acc_stderr\": 0.032736940493481824,\n \"acc_norm\": 0.5170940170940171,\n\
\ \"acc_norm_stderr\": 0.032736940493481824\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.05021167315686779,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.05021167315686779\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5070242656449553,\n\
\ \"acc_stderr\": 0.017878199003432217,\n \"acc_norm\": 0.5070242656449553,\n\
\ \"acc_norm_stderr\": 0.017878199003432217\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.40173410404624277,\n \"acc_stderr\": 0.02639410417764363,\n\
\ \"acc_norm\": 0.40173410404624277,\n \"acc_norm_stderr\": 0.02639410417764363\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2860335195530726,\n\
\ \"acc_stderr\": 0.015113972129062138,\n \"acc_norm\": 0.2860335195530726,\n\
\ \"acc_norm_stderr\": 0.015113972129062138\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.4150326797385621,\n \"acc_stderr\": 0.0282135041778241,\n\
\ \"acc_norm\": 0.4150326797385621,\n \"acc_norm_stderr\": 0.0282135041778241\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3954983922829582,\n\
\ \"acc_stderr\": 0.027770918531427838,\n \"acc_norm\": 0.3954983922829582,\n\
\ \"acc_norm_stderr\": 0.027770918531427838\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.4104938271604938,\n \"acc_stderr\": 0.027371350925124764,\n\
\ \"acc_norm\": 0.4104938271604938,\n \"acc_norm_stderr\": 0.027371350925124764\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3262411347517731,\n \"acc_stderr\": 0.027968453043563168,\n \
\ \"acc_norm\": 0.3262411347517731,\n \"acc_norm_stderr\": 0.027968453043563168\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.32985658409387225,\n\
\ \"acc_stderr\": 0.012008129938540472,\n \"acc_norm\": 0.32985658409387225,\n\
\ \"acc_norm_stderr\": 0.012008129938540472\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.40441176470588236,\n \"acc_stderr\": 0.029812630701569736,\n\
\ \"acc_norm\": 0.40441176470588236,\n \"acc_norm_stderr\": 0.029812630701569736\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.37254901960784315,\n \"acc_stderr\": 0.01955964680921593,\n \
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.01955964680921593\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.43636363636363634,\n\
\ \"acc_stderr\": 0.04750185058907297,\n \"acc_norm\": 0.43636363636363634,\n\
\ \"acc_norm_stderr\": 0.04750185058907297\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.49387755102040815,\n \"acc_stderr\": 0.03200682020163907,\n\
\ \"acc_norm\": 0.49387755102040815,\n \"acc_norm_stderr\": 0.03200682020163907\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5024875621890548,\n\
\ \"acc_stderr\": 0.03535490150137289,\n \"acc_norm\": 0.5024875621890548,\n\
\ \"acc_norm_stderr\": 0.03535490150137289\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.58,\n \"acc_stderr\": 0.04960449637488584,\n \
\ \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.04960449637488584\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.39759036144578314,\n\
\ \"acc_stderr\": 0.03809973084540217,\n \"acc_norm\": 0.39759036144578314,\n\
\ \"acc_norm_stderr\": 0.03809973084540217\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.5146198830409356,\n \"acc_stderr\": 0.03833185275213025,\n\
\ \"acc_norm\": 0.5146198830409356,\n \"acc_norm_stderr\": 0.03833185275213025\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2607099143206854,\n\
\ \"mc1_stderr\": 0.015368841620766367,\n \"mc2\": 0.4196186456267839,\n\
\ \"mc2_stderr\": 0.01433169483869778\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6495659037095501,\n \"acc_stderr\": 0.013409047676670187\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.04927975739196361,\n \
\ \"acc_stderr\": 0.005962150655812477\n }\n}\n```"
repo_url: https://huggingface.co/cyberagent/calm2-7b-chat
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_11T04_41_23.645738
path:
- '**/details_harness|arc:challenge|25_2023-12-11T04-41-23.645738.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-11T04-41-23.645738.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_11T04_41_23.645738
path:
- '**/details_harness|gsm8k|5_2023-12-11T04-41-23.645738.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-11T04-41-23.645738.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_11T04_41_23.645738
path:
- '**/details_harness|hellaswag|10_2023-12-11T04-41-23.645738.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-11T04-41-23.645738.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_11T04_41_23.645738
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-11T04-41-23.645738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-11T04-41-23.645738.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-11T04-41-23.645738.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_11T04_41_23.645738
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-11T04-41-23.645738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-11T04-41-23.645738.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_11T04_41_23.645738
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-11T04-41-23.645738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-11T04-41-23.645738.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_11T04_41_23.645738
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-11T04-41-23.645738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-11T04-41-23.645738.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_11T04_41_23.645738
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-11T04-41-23.645738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-11T04-41-23.645738.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_11T04_41_23.645738
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-11T04-41-23.645738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-11T04-41-23.645738.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_11T04_41_23.645738
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-11T04-41-23.645738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-11T04-41-23.645738.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_11T04_41_23.645738
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-11T04-41-23.645738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-11T04-41-23.645738.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_11T04_41_23.645738
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-11T04-41-23.645738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-11T04-41-23.645738.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_11T04_41_23.645738
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-11T04-41-23.645738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-11T04-41-23.645738.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_11T04_41_23.645738
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-11T04-41-23.645738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-11T04-41-23.645738.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_11T04_41_23.645738
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-11T04-41-23.645738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-11T04-41-23.645738.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_11T04_41_23.645738
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-11T04-41-23.645738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-11T04-41-23.645738.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_11T04_41_23.645738
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-11T04-41-23.645738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-11T04-41-23.645738.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_11T04_41_23.645738
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-11T04-41-23.645738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-11T04-41-23.645738.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_11T04_41_23.645738
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-11T04-41-23.645738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-11T04-41-23.645738.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_11T04_41_23.645738
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-11T04-41-23.645738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-11T04-41-23.645738.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_11T04_41_23.645738
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-11T04-41-23.645738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-11T04-41-23.645738.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_11T04_41_23.645738
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-11T04-41-23.645738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-11T04-41-23.645738.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_11T04_41_23.645738
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-11T04-41-23.645738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-11T04-41-23.645738.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_11T04_41_23.645738
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-11T04-41-23.645738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-11T04-41-23.645738.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_11T04_41_23.645738
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-11T04-41-23.645738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-11T04-41-23.645738.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_11T04_41_23.645738
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-11T04-41-23.645738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-11T04-41-23.645738.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_11T04_41_23.645738
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-11T04-41-23.645738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-11T04-41-23.645738.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_11T04_41_23.645738
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-11T04-41-23.645738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-11T04-41-23.645738.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_11T04_41_23.645738
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-11T04-41-23.645738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-11T04-41-23.645738.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_11T04_41_23.645738
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-11T04-41-23.645738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-11T04-41-23.645738.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_11T04_41_23.645738
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-11T04-41-23.645738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-11T04-41-23.645738.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_11T04_41_23.645738
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-11T04-41-23.645738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-11T04-41-23.645738.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_11T04_41_23.645738
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-11T04-41-23.645738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-11T04-41-23.645738.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_11T04_41_23.645738
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-11T04-41-23.645738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-11T04-41-23.645738.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_11T04_41_23.645738
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-11T04-41-23.645738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-11T04-41-23.645738.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_11T04_41_23.645738
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-11T04-41-23.645738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-11T04-41-23.645738.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_11T04_41_23.645738
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-11T04-41-23.645738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-11T04-41-23.645738.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_11T04_41_23.645738
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-11T04-41-23.645738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-11T04-41-23.645738.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_11T04_41_23.645738
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-11T04-41-23.645738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-11T04-41-23.645738.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_11T04_41_23.645738
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-11T04-41-23.645738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-11T04-41-23.645738.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_11T04_41_23.645738
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-11T04-41-23.645738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-11T04-41-23.645738.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_11T04_41_23.645738
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-11T04-41-23.645738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-11T04-41-23.645738.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_11T04_41_23.645738
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-11T04-41-23.645738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-11T04-41-23.645738.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_11T04_41_23.645738
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-11T04-41-23.645738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-11T04-41-23.645738.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_11T04_41_23.645738
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-11T04-41-23.645738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-11T04-41-23.645738.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_11T04_41_23.645738
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-11T04-41-23.645738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-11T04-41-23.645738.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_11T04_41_23.645738
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-11T04-41-23.645738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-11T04-41-23.645738.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_11T04_41_23.645738
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-11T04-41-23.645738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-11T04-41-23.645738.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_11T04_41_23.645738
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-11T04-41-23.645738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-11T04-41-23.645738.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_11T04_41_23.645738
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-11T04-41-23.645738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-11T04-41-23.645738.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_11T04_41_23.645738
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-11T04-41-23.645738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-11T04-41-23.645738.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_11T04_41_23.645738
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-11T04-41-23.645738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-11T04-41-23.645738.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_11T04_41_23.645738
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-11T04-41-23.645738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-11T04-41-23.645738.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_11T04_41_23.645738
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-11T04-41-23.645738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-11T04-41-23.645738.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_11T04_41_23.645738
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-11T04-41-23.645738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-11T04-41-23.645738.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_11T04_41_23.645738
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-11T04-41-23.645738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-11T04-41-23.645738.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_11T04_41_23.645738
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-11T04-41-23.645738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-11T04-41-23.645738.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_11T04_41_23.645738
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-11T04-41-23.645738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-11T04-41-23.645738.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_11T04_41_23.645738
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-11T04-41-23.645738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-11T04-41-23.645738.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_11T04_41_23.645738
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-11T04-41-23.645738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-11T04-41-23.645738.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_11T04_41_23.645738
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-11T04-41-23.645738.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-11T04-41-23.645738.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_11T04_41_23.645738
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-11T04-41-23.645738.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-11T04-41-23.645738.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_11T04_41_23.645738
path:
- '**/details_harness|winogrande|5_2023-12-11T04-41-23.645738.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-11T04-41-23.645738.parquet'
- config_name: results
data_files:
- split: 2023_12_11T04_41_23.645738
path:
- results_2023-12-11T04-41-23.645738.parquet
- split: latest
path:
- results_2023-12-11T04-41-23.645738.parquet
---
# Dataset Card for Evaluation run of cyberagent/calm2-7b-chat
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/cyberagent/calm2-7b-chat
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [cyberagent/calm2-7b-chat](https://huggingface.co/cyberagent/calm2-7b-chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_cyberagent__calm2-7b-chat",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-11T04:41:23.645738](https://huggingface.co/datasets/open-llm-leaderboard/details_cyberagent__calm2-7b-chat/blob/main/results_2023-12-11T04-41-23.645738.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each task in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.39379663191330316,
"acc_stderr": 0.03433785284156447,
"acc_norm": 0.39896146189258175,
"acc_norm_stderr": 0.0351728212913433,
"mc1": 0.2607099143206854,
"mc1_stderr": 0.015368841620766367,
"mc2": 0.4196186456267839,
"mc2_stderr": 0.01433169483869778
},
"harness|arc:challenge|25": {
"acc": 0.3609215017064846,
"acc_stderr": 0.014034761386175458,
"acc_norm": 0.40273037542662116,
"acc_norm_stderr": 0.014332236306790147
},
"harness|hellaswag|10": {
"acc": 0.5070703047201752,
"acc_stderr": 0.004989282516055394,
"acc_norm": 0.68123879705238,
"acc_norm_stderr": 0.004650438781745311
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847415,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847415
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45925925925925926,
"acc_stderr": 0.04304979692464242,
"acc_norm": 0.45925925925925926,
"acc_norm_stderr": 0.04304979692464242
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.46710526315789475,
"acc_stderr": 0.04060127035236397,
"acc_norm": 0.46710526315789475,
"acc_norm_stderr": 0.04060127035236397
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.43018867924528303,
"acc_stderr": 0.030471445867183235,
"acc_norm": 0.43018867924528303,
"acc_norm_stderr": 0.030471445867183235
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.375,
"acc_stderr": 0.04048439222695598,
"acc_norm": 0.375,
"acc_norm_stderr": 0.04048439222695598
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206824,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206824
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3699421965317919,
"acc_stderr": 0.036812296333943194,
"acc_norm": 0.3699421965317919,
"acc_norm_stderr": 0.036812296333943194
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.04280105837364395,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.04280105837364395
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.33191489361702126,
"acc_stderr": 0.03078373675774564,
"acc_norm": 0.33191489361702126,
"acc_norm_stderr": 0.03078373675774564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.04303684033537315,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.04303684033537315
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.38620689655172413,
"acc_stderr": 0.04057324734419035,
"acc_norm": 0.38620689655172413,
"acc_norm_stderr": 0.04057324734419035
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2804232804232804,
"acc_stderr": 0.023135287974325635,
"acc_norm": 0.2804232804232804,
"acc_norm_stderr": 0.023135287974325635
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.29365079365079366,
"acc_stderr": 0.04073524322147126,
"acc_norm": 0.29365079365079366,
"acc_norm_stderr": 0.04073524322147126
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.36774193548387096,
"acc_stderr": 0.02743086657997347,
"acc_norm": 0.36774193548387096,
"acc_norm_stderr": 0.02743086657997347
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.32019704433497537,
"acc_stderr": 0.032826493853041504,
"acc_norm": 0.32019704433497537,
"acc_norm_stderr": 0.032826493853041504
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.48484848484848486,
"acc_stderr": 0.03902551007374448,
"acc_norm": 0.48484848484848486,
"acc_norm_stderr": 0.03902551007374448
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.035402943770953675,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.035402943770953675
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.5233160621761658,
"acc_stderr": 0.03604513672442202,
"acc_norm": 0.5233160621761658,
"acc_norm_stderr": 0.03604513672442202
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3076923076923077,
"acc_stderr": 0.023400928918310495,
"acc_norm": 0.3076923076923077,
"acc_norm_stderr": 0.023400928918310495
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24814814814814815,
"acc_stderr": 0.0263357394040558,
"acc_norm": 0.24814814814814815,
"acc_norm_stderr": 0.0263357394040558
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.029597329730978103,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.029597329730978103
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.43486238532110094,
"acc_stderr": 0.021254631465609273,
"acc_norm": 0.43486238532110094,
"acc_norm_stderr": 0.021254631465609273
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3287037037037037,
"acc_stderr": 0.032036140846700596,
"acc_norm": 0.3287037037037037,
"acc_norm_stderr": 0.032036140846700596
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.45588235294117646,
"acc_stderr": 0.03495624522015474,
"acc_norm": 0.45588235294117646,
"acc_norm_stderr": 0.03495624522015474
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.4978902953586498,
"acc_stderr": 0.032546938018020076,
"acc_norm": 0.4978902953586498,
"acc_norm_stderr": 0.032546938018020076
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.47085201793721976,
"acc_stderr": 0.03350073248773404,
"acc_norm": 0.47085201793721976,
"acc_norm_stderr": 0.03350073248773404
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.45038167938931295,
"acc_stderr": 0.04363643698524779,
"acc_norm": 0.45038167938931295,
"acc_norm_stderr": 0.04363643698524779
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5371900826446281,
"acc_stderr": 0.04551711196104218,
"acc_norm": 0.5371900826446281,
"acc_norm_stderr": 0.04551711196104218
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.4351851851851852,
"acc_stderr": 0.04792898170907062,
"acc_norm": 0.4351851851851852,
"acc_norm_stderr": 0.04792898170907062
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3803680981595092,
"acc_stderr": 0.03814269893261837,
"acc_norm": 0.3803680981595092,
"acc_norm_stderr": 0.03814269893261837
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.29464285714285715,
"acc_stderr": 0.04327040932578728,
"acc_norm": 0.29464285714285715,
"acc_norm_stderr": 0.04327040932578728
},
"harness|hendrycksTest-management|5": {
"acc": 0.3592233009708738,
"acc_stderr": 0.04750458399041693,
"acc_norm": 0.3592233009708738,
"acc_norm_stderr": 0.04750458399041693
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.5170940170940171,
"acc_stderr": 0.032736940493481824,
"acc_norm": 0.5170940170940171,
"acc_norm_stderr": 0.032736940493481824
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.52,
"acc_stderr": 0.05021167315686779,
"acc_norm": 0.52,
"acc_norm_stderr": 0.05021167315686779
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5070242656449553,
"acc_stderr": 0.017878199003432217,
"acc_norm": 0.5070242656449553,
"acc_norm_stderr": 0.017878199003432217
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.40173410404624277,
"acc_stderr": 0.02639410417764363,
"acc_norm": 0.40173410404624277,
"acc_norm_stderr": 0.02639410417764363
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2860335195530726,
"acc_stderr": 0.015113972129062138,
"acc_norm": 0.2860335195530726,
"acc_norm_stderr": 0.015113972129062138
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4150326797385621,
"acc_stderr": 0.0282135041778241,
"acc_norm": 0.4150326797385621,
"acc_norm_stderr": 0.0282135041778241
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3954983922829582,
"acc_stderr": 0.027770918531427838,
"acc_norm": 0.3954983922829582,
"acc_norm_stderr": 0.027770918531427838
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4104938271604938,
"acc_stderr": 0.027371350925124764,
"acc_norm": 0.4104938271604938,
"acc_norm_stderr": 0.027371350925124764
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3262411347517731,
"acc_stderr": 0.027968453043563168,
"acc_norm": 0.3262411347517731,
"acc_norm_stderr": 0.027968453043563168
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.32985658409387225,
"acc_stderr": 0.012008129938540472,
"acc_norm": 0.32985658409387225,
"acc_norm_stderr": 0.012008129938540472
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.40441176470588236,
"acc_stderr": 0.029812630701569736,
"acc_norm": 0.40441176470588236,
"acc_norm_stderr": 0.029812630701569736
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.01955964680921593,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.01955964680921593
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.43636363636363634,
"acc_stderr": 0.04750185058907297,
"acc_norm": 0.43636363636363634,
"acc_norm_stderr": 0.04750185058907297
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.49387755102040815,
"acc_stderr": 0.03200682020163907,
"acc_norm": 0.49387755102040815,
"acc_norm_stderr": 0.03200682020163907
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5024875621890548,
"acc_stderr": 0.03535490150137289,
"acc_norm": 0.5024875621890548,
"acc_norm_stderr": 0.03535490150137289
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.58,
"acc_stderr": 0.04960449637488584,
"acc_norm": 0.58,
"acc_norm_stderr": 0.04960449637488584
},
"harness|hendrycksTest-virology|5": {
"acc": 0.39759036144578314,
"acc_stderr": 0.03809973084540217,
"acc_norm": 0.39759036144578314,
"acc_norm_stderr": 0.03809973084540217
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.5146198830409356,
"acc_stderr": 0.03833185275213025,
"acc_norm": 0.5146198830409356,
"acc_norm_stderr": 0.03833185275213025
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2607099143206854,
"mc1_stderr": 0.015368841620766367,
"mc2": 0.4196186456267839,
"mc2_stderr": 0.01433169483869778
},
"harness|winogrande|5": {
"acc": 0.6495659037095501,
"acc_stderr": 0.013409047676670187
},
"harness|gsm8k|5": {
"acc": 0.04927975739196361,
"acc_stderr": 0.005962150655812477
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
liuyanchen1015/VALUE_wnli_been_done | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 2794
num_examples: 12
- name: test
num_bytes: 16167
num_examples: 57
- name: train
num_bytes: 29881
num_examples: 129
download_size: 24177
dataset_size: 48842
---
# Dataset Card for "VALUE_wnli_been_done"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
EleutherAI/CEBaB | ---
license: cc-by-4.0
dataset_info:
features:
- name: original_id
dtype: int32
- name: edit_goal
dtype: string
- name: edit_type
dtype: string
- name: text
dtype: string
- name: food
dtype: string
- name: ambiance
dtype: string
- name: service
dtype: string
- name: noise
dtype: string
- name: counterfactual
dtype: bool
- name: rating
dtype: int64
splits:
- name: validation
num_bytes: 306529
num_examples: 1673
- name: test
num_bytes: 309751
num_examples: 1689
- name: train
num_bytes: 2282439
num_examples: 11728
download_size: 628886
dataset_size: 2898719
task_categories:
- text-classification
language:
- en
---
# Dataset Card for "CEBaB"
This is a lightly cleaned and simplified version of the CEBaB counterfactual restaurant review dataset from [this paper](https://arxiv.org/abs/2205.14140).
The most important difference from the original dataset is that the `rating` column corresponds to the _median_ rating provided by the Mechanical Turkers,
rather than the majority rating. These are the same whenever a majority rating exists, but when there is no majority rating (e.g. because there were two 1s,
two 2s, and one 3), the original dataset used a `"no majority"` placeholder whereas we are able to provide an aggregate rating for all reviews.
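The tie case mentioned above can be checked directly. The sketch below restates the median logic in isolation (plain Python, no `datasets` dependency), matching the `compute_median` helper in the processing script:

```python
from ast import literal_eval

def compute_median(x: str) -> int:
    """Median rating from a string-encoded multiset like "{'1': 2, '2': 2, '3': 1}"."""
    dist = literal_eval(x)
    # Expand the {rating: count} multiset into a sorted list of ratings
    ratings = sorted(
        int(rating) for rating, count in dist.items() for _ in range(count)
    )
    # With an odd number of ratings, the middle element is the median
    return ratings[len(ratings) // 2]

# Two 1s, two 2s, and one 3: no majority rating exists, but the median is well-defined
print(compute_median("{'1': 2, '2': 2, '3': 1}"))  # 2
```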
The exact code used to process the original dataset is provided below:
```py
from ast import literal_eval
from datasets import DatasetDict, Value, load_dataset
def compute_median(x: str):
"""Compute the median rating given a multiset of ratings."""
# Decode the dictionary from string format
dist = literal_eval(x)
# Should be a dictionary whose keys are string-encoded integer ratings
# and whose values are the number of times that the rating was observed
assert isinstance(dist, dict)
assert sum(dist.values()) % 2 == 1, "Number of ratings should be odd"
ratings = []
for rating, count in dist.items():
ratings.extend([int(rating)] * count)
ratings.sort()
return ratings[len(ratings) // 2]
cebab = load_dataset('CEBaB/CEBaB')
assert isinstance(cebab, DatasetDict)
# Remove redundant splits
cebab['train'] = cebab.pop('train_inclusive')
del cebab['train_exclusive']
del cebab['train_observational']
cebab = cebab.cast_column(
'original_id', Value('int32')
).map(
lambda x: {
# New column with inverted label for counterfactuals
'counterfactual': not x['is_original'],
# Reduce the rating multiset into a single median rating
'rating': compute_median(x['review_label_distribution'])
}
).map(
# Replace the empty string and 'None' with Apache Arrow nulls
lambda x: {
k: v if v not in ('', 'no majority', 'None') else None
for k, v in x.items()
}
)
# Sanity check that all the splits have the same columns
cols = next(iter(cebab.values())).column_names
assert all(split.column_names == cols for split in cebab.values())
# Clean up the names a bit
cebab = cebab.rename_columns({
col: col.removesuffix('_majority').removesuffix('_aspect')
for col in cols if col.endswith('_majority')
}).rename_column(
'description', 'text'
)
# Drop the unimportant columns
cebab = cebab.remove_columns([
col for col in cols if col.endswith('_distribution') or col.endswith('_workers')
] + [
'edit_id', 'edit_worker', 'id', 'is_original', 'opentable_metadata', 'review'
]).sort([
# Make sure counterfactual reviews come immediately after each original review
'original_id', 'counterfactual'
])
``` |
Ruramai/zimbabwe_history_heritage | ---
license: openrail
---
|
kms7530/koalphaca-orca-for-solar | ---
dataset_info:
features:
- name: formated_inst
dtype: string
splits:
- name: train
num_bytes: 44028320.0
num_examples: 33248
download_size: 23353925
dataset_size: 44028320.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/kodama_miyako_yagatekimininaru | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Kodama Miyako
This is the dataset of Kodama Miyako, containing 36 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:----------------|---------:|:----------------------------------------|:-----------------------------------------------------------------------------------------|
| raw | 36 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 92 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| raw-stage3-eyes | 101 | [Download](dataset-raw-stage3-eyes.zip) | 3-stage cropped (with eye-focus) raw data with meta information. |
| 384x512 | 36 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x704 | 36 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x880 | 36 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 92 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 92 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-p512-640 | 67 | [Download](dataset-stage3-p512-640.zip) | 3-stage cropped dataset with the area not less than 512x512 pixels. |
| stage3-eyes-640 | 101 | [Download](dataset-stage3-eyes-640.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 640 pixels. |
| stage3-eyes-800 | 101 | [Download](dataset-stage3-eyes-800.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 800 pixels. |
|
Tamazight-NLP/NLLB-Seed_Tamasheq-Latin-Script | ---
license: cc-by-sa-4.0
task_categories:
- translation
- text2text-generation
language:
- en
- taq
- ber
annotations_creators:
- expert-generated
pretty_name: No Language Left Behind Seed Data (Tamasheq (Latin script))
size_categories:
- 1K<n<10K
--- |
GEM-submissions/lewtun__this-is-a-test-name__1648137608 | ---
benchmark: gem
type: prediction
submission_name: This is a test name
tags:
- evaluation
- benchmark
---
# GEM Submission
Submission name: This is a test name
|
pasindu/COCO_half | ---
dataset_info:
features:
- name: image
dtype: image
- name: caption
dtype: string
splits:
- name: train
num_bytes: 45222400999.672
num_examples: 282694
download_size: 9568698198
dataset_size: 45222400999.672
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Locutusque/InstructMix-V2 | ---
license: other
language:
- en
- code
task_categories:
- text-generation
- question-answering
- conversational
pretty_name: InstructMix-V2
size_categories:
- 10M<n<100M
---
**Dataset Summary:**
A new and improved version of InstructMix that has nearly twice as many examples.
**Dataset Contents:**
The dataset contains a collection of instructional data with corresponding inputs and outputs. Each entry has an "Input" field that contains the instructional content, and an "Output" field that represents the corresponding response or completion. Here is a list of the datasets used:
- Locutusque/ColumnedChatCombined
- TokenBender/code_instructions_120k_alpaca_style
- Open-Orca/OpenOrca
- vicgalle/alpaca-gpt4
- ChristophSchuhmann/essays-with-instructions
- checkai/instruction-poems
- pubmed_qa
- BI55/MedText
- nampdn-ai/tiny-codes
- TIGER-Lab/MathInstruct
- garage-bAInd/Open-Platypus
- KnutJaegersberg/WizardLM_evol_instruct_V2_196k_instruct_format
- teknium/openhermes
- ssbuild/ultrachat
It contains the following two columns:
- Input (string)
- Output (string)
These should hopefully be self-explanatory.
**Dataset Composition:**
- Number of samples: 13,639,348
- Languages: English
**Use Cases:**
The InstructMix-V2 dataset is suitable for various NLP tasks, including text generation, text completion, translation, summarization, and more. It can be used to train and evaluate language models, code generation models, and other NLP-based applications.
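For example, a training pipeline might stitch the two columns into a single prompt string. The sketch below uses a generic instruction/response template; the template itself is an assumption for illustration, not a format mandated by this dataset:

```python
def format_example(example: dict) -> str:
    """Join one row's "Input" and "Output" fields into a single training string.

    The "### Instruction"/"### Response" template used here is an assumption
    for illustration only; adapt it to your model's expected prompt format.
    """
    return (
        "### Instruction:\n"
        f"{example['Input']}\n\n"
        "### Response:\n"
        f"{example['Output']}"
    )

# A made-up row in the dataset's two-column shape
row = {"Input": "Translate 'hello' to French.", "Output": "Bonjour."}
print(format_example(row))
```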
**Dataset Creation:**
The InstructMix-V2 dataset was created by combining multiple existing datasets with instructional content and adding metadata to facilitate seamless integration. The content spans a diverse set of domains and was sourced from reputable datasets and public sources.
**License:**
Please ensure that you read and adhere to the licensing agreements of the datasets included in this compilation, as some may contain specific rules that must be followed. |
james-burton/news_channel_all_text | ---
dataset_info:
features:
- name: ' n_tokens_content'
dtype: string
- name: ' n_unique_tokens'
dtype: string
- name: ' n_non_stop_words'
dtype: string
- name: ' n_non_stop_unique_tokens'
dtype: string
- name: ' num_hrefs'
dtype: string
- name: ' num_self_hrefs'
dtype: string
- name: ' num_imgs'
dtype: string
- name: ' num_videos'
dtype: string
- name: ' average_token_length'
dtype: string
- name: ' num_keywords'
dtype: string
- name: ' global_subjectivity'
dtype: string
- name: ' global_sentiment_polarity'
dtype: string
- name: ' global_rate_positive_words'
dtype: string
- name: ' global_rate_negative_words'
dtype: string
- name: ' rate_positive_words'
dtype: string
- name: ' rate_negative_words'
dtype: string
- name: article_title
dtype: string
- name: channel
dtype: int64
splits:
- name: train
num_bytes: 4893096
num_examples: 17241
- name: validation
num_bytes: 863581
num_examples: 3043
- name: test
num_bytes: 1439606
num_examples: 5071
download_size: 3921037
dataset_size: 7196283
---
# Dataset Card for "news_channel_all_text"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ChickenWing/tweet_geolocation | ---
dataset_info:
features:
- name: message
dtype: string
- name: longitude
dtype: float64
- name: latitude
dtype: float64
- name: timestamp
dtype: string
- name: place_name
dtype: string
- name: prompt
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 78511
num_examples: 200
- name: test
num_bytes: 1153122
num_examples: 5000
download_size: 0
dataset_size: 1231633
---
# Dataset Card for "tweet_geolocation"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
zolak/twitter_dataset_79_1713170114 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 279664
num_examples: 756
download_size: 140234
dataset_size: 279664
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
burcusayin/pubmed_qa_labeled_fold0_source_binary_physician_acc | ---
dataset_info:
features:
- name: QUESTION
dtype: string
- name: CONTEXTS
sequence: string
- name: LABELS
sequence: string
- name: MESHES
sequence: string
- name: YEAR
dtype: string
- name: reasoning_required_pred
dtype: string
- name: reasoning_free_pred
dtype: string
- name: final_decision
dtype: string
- name: LONG_ANSWER
dtype: string
- name: physician_70
dtype: string
- name: physician_75
dtype: string
- name: physician_80
dtype: string
- name: physician_85
dtype: string
- name: physician_90
dtype: string
- name: physician_95
dtype: string
splits:
- name: test
num_bytes: 941935
num_examples: 445
download_size: 494268
dataset_size: 941935
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
|
Erynan/10_PM_test | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype:
class_label:
names:
'0': negative
'1': positive
- name: idx
dtype: int32
splits:
- name: test
num_bytes: 1109
num_examples: 10
download_size: 3025
dataset_size: 1109
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
|
polinaeterna/test_verifications | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': almond_butter
'1': almonds
'2': apple
'3': apricot
'4': asparagus
'5': avocado
'6': bacon
'7': bacon_and_egg_burger
'8': bagel
'9': baklava
'10': banana
'11': banana_bread
'12': barbecue_sauce
'13': beans
'14': beef
'15': beef_curry
'16': beef_mince
'17': beef_stir_fry
'18': beer
'19': beetroot
'20': biltong
'21': blackberries
'22': blueberries
'23': bok_choy
'24': bread
'25': broccoli
'26': broccolini
'27': brownie
'28': brussel_sprouts
'29': burrito
'30': butter
'31': cabbage
'32': calamari
'33': candy
'34': capsicum
'35': carrot
'36': cashews
'37': cauliflower
'38': celery
'39': cheese
'40': cheeseburger
'41': cherries
'42': chicken_breast
'43': chicken_thighs
'44': chicken_wings
'45': chilli
'46': chimichurri
'47': chocolate
'48': chocolate_cake
'49': coconut
'50': coffee
'51': coleslaw
'52': cookies
'53': coriander
'54': corn
'55': corn_chips
'56': cream
'57': croissant
'58': crumbed_chicken
'59': cucumber
'60': cupcake
'61': daikon_radish
'62': dates
'63': donuts
'64': dragonfruit
'65': eggplant
'66': eggs
'67': enoki_mushroom
'68': fennel
'69': figs
'70': french_toast
'71': fried_rice
'72': fries
'73': fruit_juice
'74': garlic
'75': garlic_bread
'76': ginger
'77': goji_berries
'78': granola
'79': grapefruit
'80': grapes
'81': green_beans
'82': green_onion
'83': guacamole
'84': guava
'85': gyoza
'86': ham
'87': honey
'88': hot_chocolate
'89': ice_coffee
'90': ice_cream
'91': iceberg_lettuce
'92': jerusalem_artichoke
'93': kale
'94': karaage_chicken
'95': kimchi
'96': kiwi_fruit
'97': lamb_chops
'98': leek
'99': lemon
'100': lentils
'101': lettuce
'102': lime
'103': mandarin
'104': mango
'105': maple_syrup
'106': mashed_potato
'107': mayonnaise
'108': milk
'109': miso_soup
'110': mushrooms
'111': nectarines
'112': noodles
'113': nuts
'114': olive_oil
'115': olives
'116': omelette
'117': onion
'118': orange
'119': orange_juice
'120': oysters
'121': pain_au_chocolat
'122': pancakes
'123': papaya
'124': parsley
'125': parsnips
'126': passionfruit
'127': pasta
'128': pawpaw
'129': peach
'130': pear
'131': peas
'132': pickles
'133': pineapple
'134': pizza
'135': plum
'136': pomegranate
'137': popcorn
'138': pork_belly
'139': pork_chop
'140': pork_loins
'141': porridge
'142': potato_bake
'143': potato_chips
'144': potato_scallop
'145': potatoes
'146': prawns
'147': pumpkin
'148': radish
'149': ramen
'150': raspberries
'151': red_onion
'152': red_wine
'153': rhubarb
'154': rice
'155': roast_beef
'156': roast_pork
'157': roast_potatoes
'158': rockmelon
'159': rosemary
'160': salad
'161': salami
'162': salmon
'163': salsa
'164': salt
'165': sandwich
'166': sardines
'167': sausage_roll
'168': sausages
'169': scrambled_eggs
'170': seaweed
'171': shallots
'172': snow_peas
'173': soda
'174': soy_sauce
'175': spaghetti_bolognese
'176': spinach
'177': sports_drink
'178': squash
'179': starfruit
'180': steak
'181': strawberries
'182': sushi
'183': sweet_potato
'184': tacos
'185': tamarillo
'186': taro
'187': tea
'188': toast
'189': tofu
'190': tomato
'191': tomato_chutney
'192': tomato_sauce
'193': turnip
'194': watermelon
'195': white_onion
'196': white_wine
'197': yoghurt
'198': zucchini
splits:
- name: train
num_bytes: 2973286
num_examples: 1974
download_size: 6202707
dataset_size: 2973286
---
|
open-llm-leaderboard/details_antiven0m__brugle-rp | ---
pretty_name: Evaluation run of antiven0m/brugle-rp
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [antiven0m/brugle-rp](https://huggingface.co/antiven0m/brugle-rp) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_antiven0m__brugle-rp\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-22T02:19:10.123124](https://huggingface.co/datasets/open-llm-leaderboard/details_antiven0m__brugle-rp/blob/main/results_2024-01-22T02-19-10.123124.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.23196194129343728,\n\
\ \"acc_stderr\": 0.029934654752561563,\n \"acc_norm\": 0.2314240573187148,\n\
\ \"acc_norm_stderr\": 0.03071122006512167,\n \"mc1\": 1.0,\n \
\ \"mc1_stderr\": 0.0,\n \"mc2\": NaN,\n \"mc2_stderr\": NaN\n\
\ },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.22696245733788395,\n\
\ \"acc_stderr\": 0.012240491536132861,\n \"acc_norm\": 0.22696245733788395,\n\
\ \"acc_norm_stderr\": 0.012240491536132861\n },\n \"harness|hellaswag|10\"\
: {\n \"acc\": 0.2504481179047998,\n \"acc_stderr\": 0.004323856300539177,\n\
\ \"acc_norm\": 0.2504481179047998,\n \"acc_norm_stderr\": 0.004323856300539177\n\
\ },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.22,\n\
\ \"acc_stderr\": 0.04163331998932268,\n \"acc_norm\": 0.22,\n \
\ \"acc_norm_stderr\": 0.04163331998932268\n },\n \"harness|hendrycksTest-anatomy|5\"\
: {\n \"acc\": 0.18518518518518517,\n \"acc_stderr\": 0.03355677216313142,\n\
\ \"acc_norm\": 0.18518518518518517,\n \"acc_norm_stderr\": 0.03355677216313142\n\
\ },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.17763157894736842,\n\
\ \"acc_stderr\": 0.031103182383123398,\n \"acc_norm\": 0.17763157894736842,\n\
\ \"acc_norm_stderr\": 0.031103182383123398\n },\n \"harness|hendrycksTest-business_ethics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.21509433962264152,\n\
\ \"acc_stderr\": 0.02528839450289137,\n \"acc_norm\": 0.21509433962264152,\n\
\ \"acc_norm_stderr\": 0.02528839450289137\n },\n \"harness|hendrycksTest-college_biology|5\"\
: {\n \"acc\": 0.2569444444444444,\n \"acc_stderr\": 0.03653946969442099,\n\
\ \"acc_norm\": 0.2569444444444444,\n \"acc_norm_stderr\": 0.03653946969442099\n\
\ },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\":\
\ 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.2,\n\
\ \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.21,\n\
\ \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \
\ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.20809248554913296,\n \"acc_stderr\": 0.030952890217749874,\n\
\ \"acc_norm\": 0.20809248554913296,\n \"acc_norm_stderr\": 0.030952890217749874\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.21568627450980393,\n\
\ \"acc_stderr\": 0.04092563958237654,\n \"acc_norm\": 0.21568627450980393,\n\
\ \"acc_norm_stderr\": 0.04092563958237654\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\":\
\ 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n \"\
acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n\
\ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.20899470899470898,\n \"acc_stderr\": 0.02094048156533486,\n \"\
acc_norm\": 0.20899470899470898,\n \"acc_norm_stderr\": 0.02094048156533486\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.04040610178208841,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.04040610178208841\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.1774193548387097,\n \"acc_stderr\": 0.02173254068932927,\n \"\
acc_norm\": 0.1774193548387097,\n \"acc_norm_stderr\": 0.02173254068932927\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.15270935960591134,\n \"acc_stderr\": 0.02530890453938063,\n \"\
acc_norm\": 0.15270935960591134,\n \"acc_norm_stderr\": 0.02530890453938063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"\
acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860664,\n\
\ \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860664\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.20256410256410257,\n \"acc_stderr\": 0.020377660970371372,\n\
\ \"acc_norm\": 0.20256410256410257,\n \"acc_norm_stderr\": 0.020377660970371372\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2111111111111111,\n \"acc_stderr\": 0.024882116857655075,\n \
\ \"acc_norm\": 0.2111111111111111,\n \"acc_norm_stderr\": 0.024882116857655075\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n\
\ \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"\
acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.1926605504587156,\n \"acc_stderr\": 0.016909276884936094,\n \"\
acc_norm\": 0.1926605504587156,\n \"acc_norm_stderr\": 0.016909276884936094\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.1527777777777778,\n \"acc_stderr\": 0.024536326026134224,\n \"\
acc_norm\": 0.1527777777777778,\n \"acc_norm_stderr\": 0.024536326026134224\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n\
\ \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.31390134529147984,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.31390134529147984,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n\
\ \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n\
\ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2905982905982906,\n\
\ \"acc_stderr\": 0.02974504857267404,\n \"acc_norm\": 0.2905982905982906,\n\
\ \"acc_norm_stderr\": 0.02974504857267404\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23754789272030652,\n\
\ \"acc_stderr\": 0.015218733046150193,\n \"acc_norm\": 0.23754789272030652,\n\
\ \"acc_norm_stderr\": 0.015218733046150193\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n\
\ \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351284,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351284\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n\
\ \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n\
\ \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n\
\ \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432417,\n \
\ \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432417\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n\
\ \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n\
\ \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n\
\ \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n\
\ },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.18775510204081633,\n\
\ \"acc_stderr\": 0.02500025603954621,\n \"acc_norm\": 0.18775510204081633,\n\
\ \"acc_norm_stderr\": 0.02500025603954621\n },\n \"harness|hendrycksTest-sociology|5\"\
: {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401465,\n\
\ \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n\
\ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370518,\n\
\ \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370518\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3216374269005848,\n\
\ \"acc_stderr\": 0.03582529442573122,\n \"acc_norm\": 0.3216374269005848,\n\
\ \"acc_norm_stderr\": 0.03582529442573122\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 1.0,\n \"mc1_stderr\": 0.0,\n \"mc2\": NaN,\n\
\ \"mc2_stderr\": NaN\n },\n \"harness|winogrande|5\": {\n \"\
acc\": 0.4956590370955012,\n \"acc_stderr\": 0.014051956064076911\n },\n\
\ \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n\
\ }\n}\n```"
repo_url: https://huggingface.co/antiven0m/brugle-rp
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_22T02_19_10.123124
path:
- '**/details_harness|arc:challenge|25_2024-01-22T02-19-10.123124.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-22T02-19-10.123124.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_22T02_19_10.123124
path:
- '**/details_harness|gsm8k|5_2024-01-22T02-19-10.123124.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-22T02-19-10.123124.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_22T02_19_10.123124
path:
- '**/details_harness|hellaswag|10_2024-01-22T02-19-10.123124.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-22T02-19-10.123124.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_22T02_19_10.123124
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-22T02-19-10.123124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-22T02-19-10.123124.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-22T02-19-10.123124.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_22T02_19_10.123124
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T02-19-10.123124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T02-19-10.123124.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_22T02_19_10.123124
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-22T02-19-10.123124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-22T02-19-10.123124.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_22T02_19_10.123124
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-22T02-19-10.123124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-22T02-19-10.123124.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_22T02_19_10.123124
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T02-19-10.123124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T02-19-10.123124.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_22T02_19_10.123124
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T02-19-10.123124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T02-19-10.123124.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_22T02_19_10.123124
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-22T02-19-10.123124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-22T02-19-10.123124.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_22T02_19_10.123124
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T02-19-10.123124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T02-19-10.123124.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_22T02_19_10.123124
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T02-19-10.123124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T02-19-10.123124.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_22T02_19_10.123124
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T02-19-10.123124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T02-19-10.123124.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_22T02_19_10.123124
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T02-19-10.123124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T02-19-10.123124.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_22T02_19_10.123124
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-22T02-19-10.123124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-22T02-19-10.123124.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_22T02_19_10.123124
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-22T02-19-10.123124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-22T02-19-10.123124.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_22T02_19_10.123124
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T02-19-10.123124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T02-19-10.123124.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_22T02_19_10.123124
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-22T02-19-10.123124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-22T02-19-10.123124.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_22T02_19_10.123124
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T02-19-10.123124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T02-19-10.123124.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_22T02_19_10.123124
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T02-19-10.123124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T02-19-10.123124.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_22T02_19_10.123124
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T02-19-10.123124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T02-19-10.123124.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_22T02_19_10.123124
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-22T02-19-10.123124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-22T02-19-10.123124.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_22T02_19_10.123124
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T02-19-10.123124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T02-19-10.123124.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_22T02_19_10.123124
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T02-19-10.123124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T02-19-10.123124.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_22T02_19_10.123124
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T02-19-10.123124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T02-19-10.123124.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_22T02_19_10.123124
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T02-19-10.123124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T02-19-10.123124.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_22T02_19_10.123124
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T02-19-10.123124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T02-19-10.123124.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_22T02_19_10.123124
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T02-19-10.123124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T02-19-10.123124.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_22T02_19_10.123124
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T02-19-10.123124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T02-19-10.123124.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_22T02_19_10.123124
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T02-19-10.123124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T02-19-10.123124.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_22T02_19_10.123124
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T02-19-10.123124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T02-19-10.123124.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_22T02_19_10.123124
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T02-19-10.123124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T02-19-10.123124.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_22T02_19_10.123124
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T02-19-10.123124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T02-19-10.123124.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_22T02_19_10.123124
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T02-19-10.123124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T02-19-10.123124.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_22T02_19_10.123124
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T02-19-10.123124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T02-19-10.123124.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_22T02_19_10.123124
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T02-19-10.123124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T02-19-10.123124.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_22T02_19_10.123124
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-22T02-19-10.123124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-22T02-19-10.123124.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_22T02_19_10.123124
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T02-19-10.123124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T02-19-10.123124.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_22T02_19_10.123124
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-22T02-19-10.123124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-22T02-19-10.123124.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_22T02_19_10.123124
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T02-19-10.123124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T02-19-10.123124.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_22T02_19_10.123124
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T02-19-10.123124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T02-19-10.123124.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_22T02_19_10.123124
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T02-19-10.123124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T02-19-10.123124.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_22T02_19_10.123124
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-22T02-19-10.123124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-22T02-19-10.123124.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_22T02_19_10.123124
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-22T02-19-10.123124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-22T02-19-10.123124.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_22T02_19_10.123124
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T02-19-10.123124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T02-19-10.123124.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_22T02_19_10.123124
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T02-19-10.123124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T02-19-10.123124.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_22T02_19_10.123124
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T02-19-10.123124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T02-19-10.123124.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_22T02_19_10.123124
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T02-19-10.123124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T02-19-10.123124.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_22T02_19_10.123124
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-22T02-19-10.123124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-22T02-19-10.123124.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_22T02_19_10.123124
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-22T02-19-10.123124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-22T02-19-10.123124.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_22T02_19_10.123124
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-22T02-19-10.123124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-22T02-19-10.123124.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_22T02_19_10.123124
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T02-19-10.123124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T02-19-10.123124.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_22T02_19_10.123124
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-22T02-19-10.123124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-22T02-19-10.123124.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_22T02_19_10.123124
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T02-19-10.123124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T02-19-10.123124.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_22T02_19_10.123124
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T02-19-10.123124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T02-19-10.123124.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_22T02_19_10.123124
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-22T02-19-10.123124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-22T02-19-10.123124.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_22T02_19_10.123124
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-22T02-19-10.123124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-22T02-19-10.123124.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_22T02_19_10.123124
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-22T02-19-10.123124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-22T02-19-10.123124.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_22T02_19_10.123124
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T02-19-10.123124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T02-19-10.123124.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_22T02_19_10.123124
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-22T02-19-10.123124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-22T02-19-10.123124.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_22T02_19_10.123124
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-22T02-19-10.123124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-22T02-19-10.123124.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_22T02_19_10.123124
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-22T02-19-10.123124.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-22T02-19-10.123124.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_22T02_19_10.123124
path:
- '**/details_harness|winogrande|5_2024-01-22T02-19-10.123124.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-22T02-19-10.123124.parquet'
- config_name: results
data_files:
- split: 2024_01_22T02_19_10.123124
path:
- results_2024-01-22T02-19-10.123124.parquet
- split: latest
path:
- results_2024-01-22T02-19-10.123124.parquet
---
# Dataset Card for Evaluation run of antiven0m/brugle-rp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [antiven0m/brugle-rp](https://huggingface.co/antiven0m/brugle-rp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_antiven0m__brugle-rp",
"harness_winogrande_5",
split="train")
```
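The configuration names follow a predictable pattern derived from the harness task ids that appear in the results JSON: the `|`, `:` and `-` separators are replaced with underscores, with the few-shot count kept as the final segment. A minimal sketch of that mapping (the helper name `task_to_config` is ours, not part of the dataset):

```python
import re

def task_to_config(task: str) -> str:
    """Map a harness task id (as it appears in the results JSON)
    to the dataset configuration name accepted by load_dataset.

    Example ids: "harness|winogrande|5", "harness|truthfulqa:mc|0".
    """
    # Configuration names replace the |, : and - separators with underscores.
    return re.sub(r"[|:\-]", "_", task)
```

For instance, `task_to_config("harness|truthfulqa:mc|0")` yields `harness_truthfulqa_mc_0`, matching the configuration list above.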
## Latest results
These are the [latest results from run 2024-01-22T02:19:10.123124](https://huggingface.co/datasets/open-llm-leaderboard/details_antiven0m__brugle-rp/blob/main/results_2024-01-22T02-19-10.123124.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each task in the "results" and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.23196194129343728,
"acc_stderr": 0.029934654752561563,
"acc_norm": 0.2314240573187148,
"acc_norm_stderr": 0.03071122006512167,
"mc1": 1.0,
"mc1_stderr": 0.0,
"mc2": NaN,
"mc2_stderr": NaN
},
"harness|arc:challenge|25": {
"acc": 0.22696245733788395,
"acc_stderr": 0.012240491536132861,
"acc_norm": 0.22696245733788395,
"acc_norm_stderr": 0.012240491536132861
},
"harness|hellaswag|10": {
"acc": 0.2504481179047998,
"acc_stderr": 0.004323856300539177,
"acc_norm": 0.2504481179047998,
"acc_norm_stderr": 0.004323856300539177
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.03355677216313142,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.03355677216313142
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21509433962264152,
"acc_stderr": 0.02528839450289137,
"acc_norm": 0.21509433962264152,
"acc_norm_stderr": 0.02528839450289137
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749874,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749874
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102973,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102973
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813365,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813365
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.20899470899470898,
"acc_stderr": 0.02094048156533486,
"acc_norm": 0.20899470899470898,
"acc_norm_stderr": 0.02094048156533486
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04040610178208841,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04040610178208841
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.1774193548387097,
"acc_stderr": 0.02173254068932927,
"acc_norm": 0.1774193548387097,
"acc_norm_stderr": 0.02173254068932927
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.15270935960591134,
"acc_stderr": 0.02530890453938063,
"acc_norm": 0.15270935960591134,
"acc_norm_stderr": 0.02530890453938063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.17676767676767677,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.17676767676767677,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19689119170984457,
"acc_stderr": 0.028697873971860664,
"acc_norm": 0.19689119170984457,
"acc_norm_stderr": 0.028697873971860664
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.20256410256410257,
"acc_stderr": 0.020377660970371372,
"acc_norm": 0.20256410256410257,
"acc_norm_stderr": 0.020377660970371372
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2111111111111111,
"acc_stderr": 0.024882116857655075,
"acc_norm": 0.2111111111111111,
"acc_norm_stderr": 0.024882116857655075
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436776,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436776
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1926605504587156,
"acc_stderr": 0.016909276884936094,
"acc_norm": 0.1926605504587156,
"acc_norm_stderr": 0.016909276884936094
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1527777777777778,
"acc_stderr": 0.024536326026134224,
"acc_norm": 0.1527777777777778,
"acc_norm_stderr": 0.024536326026134224
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.31390134529147984,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.31390134529147984,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2905982905982906,
"acc_stderr": 0.02974504857267404,
"acc_norm": 0.2905982905982906,
"acc_norm_stderr": 0.02974504857267404
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.23754789272030652,
"acc_stderr": 0.015218733046150193,
"acc_norm": 0.23754789272030652,
"acc_norm_stderr": 0.015218733046150193
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.023929155517351284,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.023929155517351284
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.1864951768488746,
"acc_stderr": 0.02212243977248077,
"acc_norm": 0.1864951768488746,
"acc_norm_stderr": 0.02212243977248077
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21604938271604937,
"acc_stderr": 0.022899162918445806,
"acc_norm": 0.21604938271604937,
"acc_norm_stderr": 0.022899162918445806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.025257861359432417,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.025257861359432417
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.18382352941176472,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.18382352941176472,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.18775510204081633,
"acc_stderr": 0.02500025603954621,
"acc_norm": 0.18775510204081633,
"acc_norm_stderr": 0.02500025603954621
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370518,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370518
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3216374269005848,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.3216374269005848,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 1.0,
"mc1_stderr": 0.0,
"mc2": NaN,
"mc2_stderr": NaN
},
"harness|winogrande|5": {
"acc": 0.4956590370955012,
"acc_stderr": 0.014051956064076911
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
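As a rough illustration of how the per-task numbers above roll up into the aggregate `"all"` entry, the sketch below takes a handful of the task accuracies from the JSON and computes an unweighted mean. This assumes the leaderboard aggregates with a plain unweighted mean over tasks; the values are copied verbatim from the results above, and only a small subset of tasks is used here.

```python
# Hypothetical sketch: assume the "all" accuracy is an unweighted mean over
# per-task accuracies. Values are copied from the per-task results above
# (a small subset, for illustration only).
task_acc = {
    "harness|hendrycksTest-nutrition|5": 0.22549019607843138,
    "harness|hendrycksTest-philosophy|5": 0.1864951768488746,
    "harness|hendrycksTest-virology|5": 0.28313253012048195,
    "harness|hendrycksTest-world_religions|5": 0.3216374269005848,
}

mean_acc = sum(task_acc.values()) / len(task_acc)
print(f"unweighted mean acc over {len(task_acc)} tasks: {mean_acc:.4f}")
```

The full `"all"` block in the results is computed over every task, so it will not match this four-task subset.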
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]
tyzhu/lmind_hotpot_train8000_eval7405_v1_reciteonly_qa
---
configs:
- config_name: default
data_files:
- split: train_qa
path: data/train_qa-*
- split: train_recite_qa
path: data/train_recite_qa-*
- split: train_ic_qa
path: data/train_ic_qa-*
- split: eval_qa
path: data/eval_qa-*
- split: eval_recite_qa
path: data/eval_recite_qa-*
- split: eval_ic_qa
path: data/eval_ic_qa-*
- split: all_docs
path: data/all_docs-*
- split: all_docs_eval
path: data/all_docs_eval-*
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: answers
struct:
- name: answer_start
sequence: 'null'
- name: text
sequence: string
splits:
- name: train_qa
num_bytes: 1380987
num_examples: 8000
- name: train_recite_qa
num_bytes: 8547861
num_examples: 8000
- name: train_ic_qa
num_bytes: 8539861
num_examples: 8000
- name: eval_qa
num_bytes: 1201450
num_examples: 7405
- name: eval_recite_qa
num_bytes: 7941487
num_examples: 7405
- name: eval_ic_qa
num_bytes: 7934082
num_examples: 7405
- name: all_docs
num_bytes: 12508009
num_examples: 26854
- name: all_docs_eval
num_bytes: 12506219
num_examples: 26854
- name: train
num_bytes: 8547861
num_examples: 8000
- name: validation
num_bytes: 7941487
num_examples: 7405
download_size: 0
dataset_size: 77049304
---
# Dataset Card for "lmind_hotpot_train8000_eval7405_v1_reciteonly_qa"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
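The split metadata above can be sanity-checked offline. The sketch below uses the `(num_bytes, num_examples)` pairs copied from the `dataset_info` block (purely local arithmetic, nothing is downloaded) to compare average example size across splits; the `recite` variants come out several times larger per example, presumably because they carry the recited passages alongside each question.

```python
# (num_bytes, num_examples) pairs copied from the dataset_info block above;
# purely local arithmetic, no download involved.
splits = {
    "train_qa":        (1_380_987, 8_000),
    "train_recite_qa": (8_547_861, 8_000),
    "eval_qa":         (1_201_450, 7_405),
    "eval_recite_qa":  (7_941_487, 7_405),
}

for name, (num_bytes, num_examples) in splits.items():
    print(f"{name}: ~{num_bytes / num_examples:,.0f} bytes/example")
```

By this estimate the plain QA splits average well under 200 bytes per example, while the recite splits average over 1,000.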
saibo/bookcorpus_compact_1024_shard9_of_10_meta
---
dataset_info:
features:
- name: text
dtype: string
- name: concept_with_offset
dtype: string
- name: cid_arrangement
sequence: int32
- name: schema_lengths
sequence: int64
- name: topic_entity_mask
sequence: int64
- name: text_lengths
sequence: int64
splits:
- name: train
num_bytes: 7675706871
num_examples: 61605
download_size: 1683788529
dataset_size: 7675706871
---
# Dataset Card for "bookcorpus_compact_1024_shard9_of_10_meta"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)