datasetId | card |
|---|---|
HenriCastro/th_dt_01 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 445377
num_examples: 242
download_size: 224741
dataset_size: 445377
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "th_dt_01"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Severian__Nexus-IKM-Hermes-2-Pro-Mistral-7B | ---
pretty_name: Evaluation run of Severian/Nexus-IKM-Hermes-2-Pro-Mistral-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Severian/Nexus-IKM-Hermes-2-Pro-Mistral-7B](https://huggingface.co/Severian/Nexus-IKM-Hermes-2-Pro-Mistral-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Severian__Nexus-IKM-Hermes-2-Pro-Mistral-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-14T12:32:34.011347](https://huggingface.co/datasets/open-llm-leaderboard/details_Severian__Nexus-IKM-Hermes-2-Pro-Mistral-7B/blob/main/results_2024-03-14T12-32-34.011347.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2519731441987328,\n\
\ \"acc_stderr\": 0.030603235407632018,\n \"acc_norm\": 0.2529875911171585,\n\
\ \"acc_norm_stderr\": 0.031418977564067176,\n \"mc1\": 0.23623011015911874,\n\
\ \"mc1_stderr\": 0.01486975501587109,\n \"mc2\": NaN,\n \"\
mc2_stderr\": NaN\n },\n \"harness|arc:challenge|25\": {\n \"acc\"\
: 0.2380546075085324,\n \"acc_stderr\": 0.01244577002802621,\n \"\
acc_norm\": 0.29266211604095566,\n \"acc_norm_stderr\": 0.013295916103619406\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.270264887472615,\n\
\ \"acc_stderr\": 0.004431889783633817,\n \"acc_norm\": 0.2932682732523402,\n\
\ \"acc_norm_stderr\": 0.004543299338935422\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.044084400227680794\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2814814814814815,\n\
\ \"acc_stderr\": 0.03885004245800254,\n \"acc_norm\": 0.2814814814814815,\n\
\ \"acc_norm_stderr\": 0.03885004245800254\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.3092105263157895,\n \"acc_stderr\": 0.03761070869867479,\n\
\ \"acc_norm\": 0.3092105263157895,\n \"acc_norm_stderr\": 0.03761070869867479\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.27169811320754716,\n \"acc_stderr\": 0.027377706624670713,\n\
\ \"acc_norm\": 0.27169811320754716,\n \"acc_norm_stderr\": 0.027377706624670713\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.22916666666666666,\n\
\ \"acc_stderr\": 0.035146974678623884,\n \"acc_norm\": 0.22916666666666666,\n\
\ \"acc_norm_stderr\": 0.035146974678623884\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.11,\n \"acc_stderr\": 0.031446603773522035,\n \
\ \"acc_norm\": 0.11,\n \"acc_norm_stderr\": 0.031446603773522035\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.22,\n \"acc_stderr\": 0.0416333199893227,\n \"acc_norm\"\
: 0.22,\n \"acc_norm_stderr\": 0.0416333199893227\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.14,\n \"acc_stderr\": 0.034873508801977725,\n \
\ \"acc_norm\": 0.14,\n \"acc_norm_stderr\": 0.034873508801977725\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.23699421965317918,\n\
\ \"acc_stderr\": 0.032424147574830996,\n \"acc_norm\": 0.23699421965317918,\n\
\ \"acc_norm_stderr\": 0.032424147574830996\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.042207736591714534,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.042207736591714534\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.22127659574468084,\n \"acc_stderr\": 0.02713634960242406,\n\
\ \"acc_norm\": 0.22127659574468084,\n \"acc_norm_stderr\": 0.02713634960242406\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n\
\ \"acc_stderr\": 0.040493392977481425,\n \"acc_norm\": 0.24561403508771928,\n\
\ \"acc_norm_stderr\": 0.040493392977481425\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2827586206896552,\n \"acc_stderr\": 0.03752833958003337,\n\
\ \"acc_norm\": 0.2827586206896552,\n \"acc_norm_stderr\": 0.03752833958003337\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3201058201058201,\n \"acc_stderr\": 0.024026846392873502,\n \"\
acc_norm\": 0.3201058201058201,\n \"acc_norm_stderr\": 0.024026846392873502\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.25396825396825395,\n\
\ \"acc_stderr\": 0.03893259610604672,\n \"acc_norm\": 0.25396825396825395,\n\
\ \"acc_norm_stderr\": 0.03893259610604672\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3032258064516129,\n\
\ \"acc_stderr\": 0.026148685930671742,\n \"acc_norm\": 0.3032258064516129,\n\
\ \"acc_norm_stderr\": 0.026148685930671742\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2019704433497537,\n \"acc_stderr\": 0.028247350122180267,\n\
\ \"acc_norm\": 0.2019704433497537,\n \"acc_norm_stderr\": 0.028247350122180267\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.3151515151515151,\n \"acc_stderr\": 0.0362773057502241,\n\
\ \"acc_norm\": 0.3151515151515151,\n \"acc_norm_stderr\": 0.0362773057502241\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.2676767676767677,\n \"acc_stderr\": 0.03154449888270285,\n \"\
acc_norm\": 0.2676767676767677,\n \"acc_norm_stderr\": 0.03154449888270285\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.24870466321243523,\n \"acc_stderr\": 0.031195840877700293,\n\
\ \"acc_norm\": 0.24870466321243523,\n \"acc_norm_stderr\": 0.031195840877700293\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2641025641025641,\n \"acc_stderr\": 0.02235219373745327,\n \
\ \"acc_norm\": 0.2641025641025641,\n \"acc_norm_stderr\": 0.02235219373745327\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.25555555555555554,\n \"acc_stderr\": 0.026593939101844054,\n \
\ \"acc_norm\": 0.25555555555555554,\n \"acc_norm_stderr\": 0.026593939101844054\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.2184873949579832,\n \"acc_stderr\": 0.026841514322958945,\n\
\ \"acc_norm\": 0.2184873949579832,\n \"acc_norm_stderr\": 0.026841514322958945\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2119205298013245,\n \"acc_stderr\": 0.03336767086567977,\n \"\
acc_norm\": 0.2119205298013245,\n \"acc_norm_stderr\": 0.03336767086567977\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.24403669724770644,\n \"acc_stderr\": 0.018415286351416416,\n \"\
acc_norm\": 0.24403669724770644,\n \"acc_norm_stderr\": 0.018415286351416416\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.25462962962962965,\n \"acc_stderr\": 0.029711275860005344,\n \"\
acc_norm\": 0.25462962962962965,\n \"acc_norm_stderr\": 0.029711275860005344\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.2647058823529412,\n \"acc_stderr\": 0.03096451792692341,\n \"\
acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.03096451792692341\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.27848101265822783,\n \"acc_stderr\": 0.029178682304842562,\n \
\ \"acc_norm\": 0.27848101265822783,\n \"acc_norm_stderr\": 0.029178682304842562\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.23766816143497757,\n\
\ \"acc_stderr\": 0.028568079464714274,\n \"acc_norm\": 0.23766816143497757,\n\
\ \"acc_norm_stderr\": 0.028568079464714274\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.22137404580152673,\n \"acc_stderr\": 0.03641297081313729,\n\
\ \"acc_norm\": 0.22137404580152673,\n \"acc_norm_stderr\": 0.03641297081313729\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.3305785123966942,\n \"acc_stderr\": 0.04294340845212095,\n \"\
acc_norm\": 0.3305785123966942,\n \"acc_norm_stderr\": 0.04294340845212095\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.18518518518518517,\n\
\ \"acc_stderr\": 0.03755265865037183,\n \"acc_norm\": 0.18518518518518517,\n\
\ \"acc_norm_stderr\": 0.03755265865037183\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.24539877300613497,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.24539877300613497,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n\
\ \"acc_stderr\": 0.043270409325787296,\n \"acc_norm\": 0.29464285714285715,\n\
\ \"acc_norm_stderr\": 0.043270409325787296\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.1941747572815534,\n \"acc_stderr\": 0.03916667762822585,\n\
\ \"acc_norm\": 0.1941747572815534,\n \"acc_norm_stderr\": 0.03916667762822585\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.23504273504273504,\n\
\ \"acc_stderr\": 0.027778835904935434,\n \"acc_norm\": 0.23504273504273504,\n\
\ \"acc_norm_stderr\": 0.027778835904935434\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2707535121328225,\n\
\ \"acc_stderr\": 0.015889888362560486,\n \"acc_norm\": 0.2707535121328225,\n\
\ \"acc_norm_stderr\": 0.015889888362560486\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.20809248554913296,\n \"acc_stderr\": 0.0218552552634218,\n\
\ \"acc_norm\": 0.20809248554913296,\n \"acc_norm_stderr\": 0.0218552552634218\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.26145251396648045,\n\
\ \"acc_stderr\": 0.014696599650364555,\n \"acc_norm\": 0.26145251396648045,\n\
\ \"acc_norm_stderr\": 0.014696599650364555\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.23202614379084968,\n \"acc_stderr\": 0.024170840879341005,\n\
\ \"acc_norm\": 0.23202614379084968,\n \"acc_norm_stderr\": 0.024170840879341005\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.26366559485530544,\n\
\ \"acc_stderr\": 0.02502553850053234,\n \"acc_norm\": 0.26366559485530544,\n\
\ \"acc_norm_stderr\": 0.02502553850053234\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.24691358024691357,\n \"acc_stderr\": 0.02399350170904211,\n\
\ \"acc_norm\": 0.24691358024691357,\n \"acc_norm_stderr\": 0.02399350170904211\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.25886524822695034,\n \"acc_stderr\": 0.026129572527180848,\n \
\ \"acc_norm\": 0.25886524822695034,\n \"acc_norm_stderr\": 0.026129572527180848\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2633637548891786,\n\
\ \"acc_stderr\": 0.01124950640360528,\n \"acc_norm\": 0.2633637548891786,\n\
\ \"acc_norm_stderr\": 0.01124950640360528\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.22794117647058823,\n \"acc_stderr\": 0.025483081468029804,\n\
\ \"acc_norm\": 0.22794117647058823,\n \"acc_norm_stderr\": 0.025483081468029804\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2761437908496732,\n \"acc_stderr\": 0.018087276935663137,\n \
\ \"acc_norm\": 0.2761437908496732,\n \"acc_norm_stderr\": 0.018087276935663137\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.20909090909090908,\n\
\ \"acc_stderr\": 0.03895091015724136,\n \"acc_norm\": 0.20909090909090908,\n\
\ \"acc_norm_stderr\": 0.03895091015724136\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.20408163265306123,\n \"acc_stderr\": 0.025801283475090506,\n\
\ \"acc_norm\": 0.20408163265306123,\n \"acc_norm_stderr\": 0.025801283475090506\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.31343283582089554,\n\
\ \"acc_stderr\": 0.03280188205348643,\n \"acc_norm\": 0.31343283582089554,\n\
\ \"acc_norm_stderr\": 0.03280188205348643\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768077,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768077\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.18674698795180722,\n\
\ \"acc_stderr\": 0.0303387491445006,\n \"acc_norm\": 0.18674698795180722,\n\
\ \"acc_norm_stderr\": 0.0303387491445006\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.32748538011695905,\n \"acc_stderr\": 0.035993357714560276,\n\
\ \"acc_norm\": 0.32748538011695905,\n \"acc_norm_stderr\": 0.035993357714560276\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23623011015911874,\n\
\ \"mc1_stderr\": 0.01486975501587109,\n \"mc2\": NaN,\n \"\
mc2_stderr\": NaN\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5217048145224941,\n\
\ \"acc_stderr\": 0.014039239216484633\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```"
repo_url: https://huggingface.co/Severian/Nexus-IKM-Hermes-2-Pro-Mistral-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_14T12_32_34.011347
path:
- '**/details_harness|arc:challenge|25_2024-03-14T12-32-34.011347.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-14T12-32-34.011347.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_14T12_32_34.011347
path:
- '**/details_harness|gsm8k|5_2024-03-14T12-32-34.011347.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-14T12-32-34.011347.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_14T12_32_34.011347
path:
- '**/details_harness|hellaswag|10_2024-03-14T12-32-34.011347.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-14T12-32-34.011347.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_14T12_32_34.011347
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-14T12-32-34.011347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-14T12-32-34.011347.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-14T12-32-34.011347.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_14T12_32_34.011347
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-14T12-32-34.011347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-14T12-32-34.011347.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_14T12_32_34.011347
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-14T12-32-34.011347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-14T12-32-34.011347.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_14T12_32_34.011347
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-14T12-32-34.011347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-14T12-32-34.011347.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_14T12_32_34.011347
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-14T12-32-34.011347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-14T12-32-34.011347.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_14T12_32_34.011347
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-14T12-32-34.011347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-14T12-32-34.011347.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_14T12_32_34.011347
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-14T12-32-34.011347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-14T12-32-34.011347.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_14T12_32_34.011347
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-14T12-32-34.011347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-14T12-32-34.011347.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_14T12_32_34.011347
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-14T12-32-34.011347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-14T12-32-34.011347.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_14T12_32_34.011347
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-14T12-32-34.011347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-14T12-32-34.011347.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_14T12_32_34.011347
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-14T12-32-34.011347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-14T12-32-34.011347.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_14T12_32_34.011347
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-14T12-32-34.011347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-14T12-32-34.011347.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_14T12_32_34.011347
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-14T12-32-34.011347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-14T12-32-34.011347.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_14T12_32_34.011347
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-14T12-32-34.011347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-14T12-32-34.011347.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_14T12_32_34.011347
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-14T12-32-34.011347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-14T12-32-34.011347.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_14T12_32_34.011347
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-14T12-32-34.011347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-14T12-32-34.011347.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_14T12_32_34.011347
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-14T12-32-34.011347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-14T12-32-34.011347.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_14T12_32_34.011347
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-14T12-32-34.011347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-14T12-32-34.011347.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_14T12_32_34.011347
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-14T12-32-34.011347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-14T12-32-34.011347.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_14T12_32_34.011347
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-14T12-32-34.011347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-14T12-32-34.011347.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_14T12_32_34.011347
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-14T12-32-34.011347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-14T12-32-34.011347.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_14T12_32_34.011347
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-14T12-32-34.011347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-14T12-32-34.011347.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_14T12_32_34.011347
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-14T12-32-34.011347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-14T12-32-34.011347.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_14T12_32_34.011347
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-14T12-32-34.011347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-14T12-32-34.011347.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_14T12_32_34.011347
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-14T12-32-34.011347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-14T12-32-34.011347.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_14T12_32_34.011347
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-14T12-32-34.011347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-14T12-32-34.011347.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_14T12_32_34.011347
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-14T12-32-34.011347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-14T12-32-34.011347.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_14T12_32_34.011347
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-14T12-32-34.011347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-14T12-32-34.011347.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_14T12_32_34.011347
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-14T12-32-34.011347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-14T12-32-34.011347.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_14T12_32_34.011347
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-14T12-32-34.011347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-14T12-32-34.011347.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_14T12_32_34.011347
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-14T12-32-34.011347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-14T12-32-34.011347.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_14T12_32_34.011347
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-14T12-32-34.011347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-14T12-32-34.011347.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_14T12_32_34.011347
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-14T12-32-34.011347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-14T12-32-34.011347.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_14T12_32_34.011347
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-14T12-32-34.011347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-14T12-32-34.011347.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_14T12_32_34.011347
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-14T12-32-34.011347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-14T12-32-34.011347.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_14T12_32_34.011347
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-14T12-32-34.011347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-14T12-32-34.011347.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_14T12_32_34.011347
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-14T12-32-34.011347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-14T12-32-34.011347.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_14T12_32_34.011347
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-14T12-32-34.011347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-14T12-32-34.011347.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_14T12_32_34.011347
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-14T12-32-34.011347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-14T12-32-34.011347.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_14T12_32_34.011347
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-14T12-32-34.011347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-14T12-32-34.011347.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_14T12_32_34.011347
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-14T12-32-34.011347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-14T12-32-34.011347.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_14T12_32_34.011347
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-14T12-32-34.011347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-14T12-32-34.011347.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_14T12_32_34.011347
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-14T12-32-34.011347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-14T12-32-34.011347.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_14T12_32_34.011347
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-14T12-32-34.011347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-14T12-32-34.011347.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_14T12_32_34.011347
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-14T12-32-34.011347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-14T12-32-34.011347.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_14T12_32_34.011347
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-14T12-32-34.011347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-14T12-32-34.011347.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_14T12_32_34.011347
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-14T12-32-34.011347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-14T12-32-34.011347.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_14T12_32_34.011347
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-14T12-32-34.011347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-14T12-32-34.011347.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_14T12_32_34.011347
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-14T12-32-34.011347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-14T12-32-34.011347.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_14T12_32_34.011347
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-14T12-32-34.011347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-14T12-32-34.011347.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_14T12_32_34.011347
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-14T12-32-34.011347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-14T12-32-34.011347.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_14T12_32_34.011347
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-14T12-32-34.011347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-14T12-32-34.011347.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_14T12_32_34.011347
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-14T12-32-34.011347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-14T12-32-34.011347.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_14T12_32_34.011347
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-14T12-32-34.011347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-14T12-32-34.011347.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_14T12_32_34.011347
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-14T12-32-34.011347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-14T12-32-34.011347.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_14T12_32_34.011347
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-14T12-32-34.011347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-14T12-32-34.011347.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_14T12_32_34.011347
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-14T12-32-34.011347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-14T12-32-34.011347.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_14T12_32_34.011347
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-14T12-32-34.011347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-14T12-32-34.011347.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_14T12_32_34.011347
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-14T12-32-34.011347.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-14T12-32-34.011347.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_14T12_32_34.011347
path:
- '**/details_harness|winogrande|5_2024-03-14T12-32-34.011347.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-14T12-32-34.011347.parquet'
- config_name: results
data_files:
- split: 2024_03_14T12_32_34.011347
path:
- results_2024-03-14T12-32-34.011347.parquet
- split: latest
path:
- results_2024-03-14T12-32-34.011347.parquet
---
# Dataset Card for Evaluation run of Severian/Nexus-IKM-Hermes-2-Pro-Mistral-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Severian/Nexus-IKM-Hermes-2-Pro-Mistral-7B](https://huggingface.co/Severian/Nexus-IKM-Hermes-2-Pro-Mistral-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Severian__Nexus-IKM-Hermes-2-Pro-Mistral-7B",
"harness_winogrande_5",
split="train")
```
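Each per-task configuration name listed above is derived mechanically from the harness task identifier: the `|` separators, `-`, and `:` all become underscores (for example, `harness|hendrycksTest-computer_security|5` maps to `harness_hendrycksTest_computer_security_5`). A small helper — inferred from the config names in this card, not an official API — can build the config name for any task:

```python
def task_to_config(task: str) -> str:
    """Map a harness task identifier to its dataset config name.

    Naming convention inferred from the config list in this card.
    """
    # Replace every separator used in harness task ids with "_".
    return task.replace("|", "_").replace("-", "_").replace(":", "_")
```

This makes it easy to iterate over the tasks reported in the results JSON and load the matching per-task details config.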
## Latest results
These are the [latest results from run 2024-03-14T12:32:34.011347](https://huggingface.co/datasets/open-llm-leaderboard/details_Severian__Nexus-IKM-Hermes-2-Pro-Mistral-7B/blob/main/results_2024-03-14T12-32-34.011347.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.2519731441987328,
"acc_stderr": 0.030603235407632018,
"acc_norm": 0.2529875911171585,
"acc_norm_stderr": 0.031418977564067176,
"mc1": 0.23623011015911874,
"mc1_stderr": 0.01486975501587109,
"mc2": NaN,
"mc2_stderr": NaN
},
"harness|arc:challenge|25": {
"acc": 0.2380546075085324,
"acc_stderr": 0.01244577002802621,
"acc_norm": 0.29266211604095566,
"acc_norm_stderr": 0.013295916103619406
},
"harness|hellaswag|10": {
"acc": 0.270264887472615,
"acc_stderr": 0.004431889783633817,
"acc_norm": 0.2932682732523402,
"acc_norm_stderr": 0.004543299338935422
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2814814814814815,
"acc_stderr": 0.03885004245800254,
"acc_norm": 0.2814814814814815,
"acc_norm_stderr": 0.03885004245800254
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3092105263157895,
"acc_stderr": 0.03761070869867479,
"acc_norm": 0.3092105263157895,
"acc_norm_stderr": 0.03761070869867479
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.27169811320754716,
"acc_stderr": 0.027377706624670713,
"acc_norm": 0.27169811320754716,
"acc_norm_stderr": 0.027377706624670713
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.22916666666666666,
"acc_stderr": 0.035146974678623884,
"acc_norm": 0.22916666666666666,
"acc_norm_stderr": 0.035146974678623884
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.11,
"acc_stderr": 0.031446603773522035,
"acc_norm": 0.11,
"acc_norm_stderr": 0.031446603773522035
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.22,
"acc_stderr": 0.0416333199893227,
"acc_norm": 0.22,
"acc_norm_stderr": 0.0416333199893227
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.14,
"acc_stderr": 0.034873508801977725,
"acc_norm": 0.14,
"acc_norm_stderr": 0.034873508801977725
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.23699421965317918,
"acc_stderr": 0.032424147574830996,
"acc_norm": 0.23699421965317918,
"acc_norm_stderr": 0.032424147574830996
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.042207736591714534,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.042207736591714534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.22127659574468084,
"acc_stderr": 0.02713634960242406,
"acc_norm": 0.22127659574468084,
"acc_norm_stderr": 0.02713634960242406
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.040493392977481425,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.040493392977481425
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2827586206896552,
"acc_stderr": 0.03752833958003337,
"acc_norm": 0.2827586206896552,
"acc_norm_stderr": 0.03752833958003337
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3201058201058201,
"acc_stderr": 0.024026846392873502,
"acc_norm": 0.3201058201058201,
"acc_norm_stderr": 0.024026846392873502
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.03893259610604672,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.03893259610604672
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3032258064516129,
"acc_stderr": 0.026148685930671742,
"acc_norm": 0.3032258064516129,
"acc_norm_stderr": 0.026148685930671742
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2019704433497537,
"acc_stderr": 0.028247350122180267,
"acc_norm": 0.2019704433497537,
"acc_norm_stderr": 0.028247350122180267
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.3151515151515151,
"acc_stderr": 0.0362773057502241,
"acc_norm": 0.3151515151515151,
"acc_norm_stderr": 0.0362773057502241
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2676767676767677,
"acc_stderr": 0.03154449888270285,
"acc_norm": 0.2676767676767677,
"acc_norm_stderr": 0.03154449888270285
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.24870466321243523,
"acc_stderr": 0.031195840877700293,
"acc_norm": 0.24870466321243523,
"acc_norm_stderr": 0.031195840877700293
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2641025641025641,
"acc_stderr": 0.02235219373745327,
"acc_norm": 0.2641025641025641,
"acc_norm_stderr": 0.02235219373745327
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25555555555555554,
"acc_stderr": 0.026593939101844054,
"acc_norm": 0.25555555555555554,
"acc_norm_stderr": 0.026593939101844054
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2184873949579832,
"acc_stderr": 0.026841514322958945,
"acc_norm": 0.2184873949579832,
"acc_norm_stderr": 0.026841514322958945
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2119205298013245,
"acc_stderr": 0.03336767086567977,
"acc_norm": 0.2119205298013245,
"acc_norm_stderr": 0.03336767086567977
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.24403669724770644,
"acc_stderr": 0.018415286351416416,
"acc_norm": 0.24403669724770644,
"acc_norm_stderr": 0.018415286351416416
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.25462962962962965,
"acc_stderr": 0.029711275860005344,
"acc_norm": 0.25462962962962965,
"acc_norm_stderr": 0.029711275860005344
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.03096451792692341,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.03096451792692341
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.27848101265822783,
"acc_stderr": 0.029178682304842562,
"acc_norm": 0.27848101265822783,
"acc_norm_stderr": 0.029178682304842562
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.23766816143497757,
"acc_stderr": 0.028568079464714274,
"acc_norm": 0.23766816143497757,
"acc_norm_stderr": 0.028568079464714274
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.22137404580152673,
"acc_stderr": 0.03641297081313729,
"acc_norm": 0.22137404580152673,
"acc_norm_stderr": 0.03641297081313729
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.3305785123966942,
"acc_stderr": 0.04294340845212095,
"acc_norm": 0.3305785123966942,
"acc_norm_stderr": 0.04294340845212095
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.03755265865037183,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.03755265865037183
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.24539877300613497,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.24539877300613497,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.29464285714285715,
"acc_stderr": 0.043270409325787296,
"acc_norm": 0.29464285714285715,
"acc_norm_stderr": 0.043270409325787296
},
"harness|hendrycksTest-management|5": {
"acc": 0.1941747572815534,
"acc_stderr": 0.03916667762822585,
"acc_norm": 0.1941747572815534,
"acc_norm_stderr": 0.03916667762822585
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.23504273504273504,
"acc_stderr": 0.027778835904935434,
"acc_norm": 0.23504273504273504,
"acc_norm_stderr": 0.027778835904935434
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2707535121328225,
"acc_stderr": 0.015889888362560486,
"acc_norm": 0.2707535121328225,
"acc_norm_stderr": 0.015889888362560486
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.0218552552634218,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.0218552552634218
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.26145251396648045,
"acc_stderr": 0.014696599650364555,
"acc_norm": 0.26145251396648045,
"acc_norm_stderr": 0.014696599650364555
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.23202614379084968,
"acc_stderr": 0.024170840879341005,
"acc_norm": 0.23202614379084968,
"acc_norm_stderr": 0.024170840879341005
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.26366559485530544,
"acc_stderr": 0.02502553850053234,
"acc_norm": 0.26366559485530544,
"acc_norm_stderr": 0.02502553850053234
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.24691358024691357,
"acc_stderr": 0.02399350170904211,
"acc_norm": 0.24691358024691357,
"acc_norm_stderr": 0.02399350170904211
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.25886524822695034,
"acc_stderr": 0.026129572527180848,
"acc_norm": 0.25886524822695034,
"acc_norm_stderr": 0.026129572527180848
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2633637548891786,
"acc_stderr": 0.01124950640360528,
"acc_norm": 0.2633637548891786,
"acc_norm_stderr": 0.01124950640360528
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.22794117647058823,
"acc_stderr": 0.025483081468029804,
"acc_norm": 0.22794117647058823,
"acc_norm_stderr": 0.025483081468029804
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2761437908496732,
"acc_stderr": 0.018087276935663137,
"acc_norm": 0.2761437908496732,
"acc_norm_stderr": 0.018087276935663137
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.20909090909090908,
"acc_stderr": 0.03895091015724136,
"acc_norm": 0.20909090909090908,
"acc_norm_stderr": 0.03895091015724136
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.20408163265306123,
"acc_stderr": 0.025801283475090506,
"acc_norm": 0.20408163265306123,
"acc_norm_stderr": 0.025801283475090506
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.31343283582089554,
"acc_stderr": 0.03280188205348643,
"acc_norm": 0.31343283582089554,
"acc_norm_stderr": 0.03280188205348643
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768077,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768077
},
"harness|hendrycksTest-virology|5": {
"acc": 0.18674698795180722,
"acc_stderr": 0.0303387491445006,
"acc_norm": 0.18674698795180722,
"acc_norm_stderr": 0.0303387491445006
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.32748538011695905,
"acc_stderr": 0.035993357714560276,
"acc_norm": 0.32748538011695905,
"acc_norm_stderr": 0.035993357714560276
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23623011015911874,
"mc1_stderr": 0.01486975501587109,
"mc2": NaN,
"mc2_stderr": NaN
},
"harness|winogrande|5": {
"acc": 0.5217048145224941,
"acc_stderr": 0.014039239216484633
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
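The `"all"` block at the top aggregates the per-task metrics. As a minimal sketch (the leaderboard's own aggregation may weight or select tasks differently), an unweighted mean over a few of the task entries can be computed like this:

```python
# A few of the per-task entries from the results JSON above.
results = {
    "harness|arc:challenge|25": {"acc": 0.2380546075085324},
    "harness|hellaswag|10": {"acc": 0.270264887472615},
    "harness|winogrande|5": {"acc": 0.5217048145224941},
}

# Unweighted mean accuracy; the official "all" figure is computed
# over the full task set, so this is only illustrative.
mean_acc = sum(m["acc"] for m in results.values()) / len(results)
```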
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
arieg/cluster_cls_large_8 | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
          '0': '000002'
          '1': '000005'
          '2': '000010'
          '3': '000140'
          '4': '000141'
          '5': '000148'
          '6': '000182'
          '7': '000190'
          '8': '000193'
          '9': '000194'
          '10': '000197'
          '11': '000200'
          '12': '000203'
          '13': '000204'
          '14': '000207'
          '15': '000210'
          '16': '000211'
          '17': '000212'
          '18': '000213'
          '19': '000255'
          '20': '000256'
          '21': '000368'
          '22': '000424'
          '23': '000459'
          '24': '000534'
          '25': '000540'
          '26': '000546'
          '27': '000574'
          '28': '000602'
          '29': '000615'
          '30': '000620'
          '31': '000621'
          '32': '000625'
          '33': '000666'
          '34': '000667'
          '35': '000676'
          '36': '000690'
          '37': '000694'
          '38': '000695'
          '39': '000704'
          '40': '000705'
          '41': '000706'
          '42': '000707'
          '43': '000708'
          '44': '000709'
          '45': '000714'
          '46': '000715'
          '47': '000716'
          '48': '000718'
          '49': '000777'
          '50': '000814'
          '51': '000821'
          '52': '000822'
          '53': '000825'
          '54': '000853'
          '55': '000890'
          '56': '000892'
          '57': '000897'
          '58': '000993'
          '59': '000995'
          '60': '000997'
          '61': '000998'
          '62': '001039'
          '63': '001040'
          '64': '001066'
          '65': '001069'
          '66': '001073'
          '67': '001075'
          '68': '001082'
          '69': '001083'
          '70': '001087'
          '71': '001102'
          '72': '001193'
          '73': '001195'
          '74': '001196'
          '75': '001197'
          '76': '001249'
          '77': '001259'
          '78': '001270'
          '79': '001276'
splits:
- name: train
num_bytes: 33869792.0
num_examples: 640
download_size: 33879741
dataset_size: 33869792.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
NLPinas/ph_en_text_detoxed | ---
license: apache-2.0
language:
- tl
- en
size_categories:
- 1M<n<10M
task_categories:
- text-generation
- question-answering
---
PhEnText Detoxed is a large-scale, multi-domain lexical dataset written in Philippine English and Taglish. The news articles, religious articles, and court decisions collated by the [original researchers](https://ieeexplore.ieee.org/abstract/document/9923429) were filtered for toxicity, and special characters were further preprocessed. This dataset has been configured to make it easy to fine-tune LLaMA-based models (Alpaca, Guanaco, Vicuna, LLaMA 2, etc.). In total, this dataset contains 6.29 million rows of training data and 2.7 million rows of testing data.
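As a minimal sketch of preparing rows for LLaMA-family fine-tuning, an Alpaca-style prompt formatter could look like the following. The field names (`instruction`, `input`, `output`) are an assumption for illustration — check the dataset's actual column names before use:

```python
def format_alpaca(example: dict) -> str:
    """Render one row as an Alpaca-style prompt.

    Field names ("instruction", "input", "output") are assumed here;
    verify them against the dataset's actual columns.
    """
    if example.get("input"):
        return (
            "Below is an instruction that describes a task, paired with an input "
            "that provides further context. Write a response that appropriately "
            "completes the request.\n\n"
            f"### Instruction:\n{example['instruction']}\n\n"
            f"### Input:\n{example['input']}\n\n"
            f"### Response:\n{example['output']}"
        )
    # Instruction-only variant for rows with an empty input field.
    return (
        "Below is an instruction that describes a task. Write a response that "
        "appropriately completes the request.\n\n"
        f"### Instruction:\n{example['instruction']}\n\n"
        f"### Response:\n{example['output']}"
    )
```

This is the template Alpaca-derived trainers typically expect, with the empty-`input` variant used for instruction-only rows.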
## Sources
According to Canon et al. (2022), here is the original breakdown of the dataset sources:
| Source | Website | Year | Number of Documents |
|-------------------|------------------------|--------------|---------------------|
| Online news (Philippine Daily Inquirer) | inquirer.net | 2009-2021 | 834,630 |
| Online news (Manila Bulletin) | mb.com.ph | 2018-2021 | 248,408 |
| Jurisprudence | lawphil.net | 1901-2021 | 59,905 |
| Old digital periodicals | repository.mainlib.upd.edu.ph | 1904-1981 | 20,999 |
| Religious texts | cbcponline.net | 2009-2022 | 2,281 |
| Laws and Issuances| officialgazette.gov.ph | 1906-2016 | 30,215 |
## Ethical Considerations
Before and after training/fine-tuning a model on this dataset, it is important to take note of the following:
1. **Fairness and Bias:** The model's responses may reflect biases present in the training data. Be aware of potential biases and make an effort to evaluate responses critically and fairly.
2. **Transparency:** The model operates as a predictive text generator based on patterns learned from the training data.
3. **User Responsibility:** Users should take responsibility for their own decisions and not solely rely on the information provided by the model. Consult with the appropriate professionals or reliable sources for specific advice or recommendations.
4. **NSFW Content:** The data has already been detoxified; however, it may still contain sensitive topics including violence, gore, and sexual content. If you plan to further refine your model for safe/aligned usage, you are highly encouraged to implement guardrails along with it.
5. **Timeliness:** The data's cutoff date is December 2021. The data must not be used to generate content that heavily relies on events after the cutoff date.
## References
```bibtex
@INPROCEEDINGS{9923429,
author={Canon, Mary Joy P. and Sy, Christian Y. and Palaoag, Thelma D. and Roxas, Rachel Edita O. and Maceda, Lany L.},
booktitle={2022 International Conference on Advanced Computer Science and Information Systems (ICACSIS)},
title={Language Resource Construction of Multi-Domain Philippine English Text for Pre-training Objective},
year={2022},
volume={},
number={},
pages={149-154},
doi={10.1109/ICACSIS56558.2022.9923429}}
@misc{phentext_detoxed,
author = {Catapang, Jasper Kyle and Peramo, Elmer},
title = {PhEnText Detoxed},
year = {2023},
publisher = {Hugging Face},
journal = {Hugging Face Hub},
howpublished = {\url{https://huggingface.co/datasets/NLPinas/ph_en_text_detoxed}}
}
``` |
ChanceFocus/flare-finqa | ---
dataset_info:
features:
- name: id
dtype: string
- name: query
dtype: string
- name: answer
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 27056024
num_examples: 6251
- name: valid
num_bytes: 3764872
num_examples: 883
- name: test
num_bytes: 4846110
num_examples: 1147
download_size: 0
dataset_size: 35667006
---
# Dataset Card for "flare-finqa"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Aanisha/NeonGAN_dataset | ---
license: mit
---
|
elyza/ELYZA-tasks-100 | ---
task_categories:
- text2text-generation
language:
- ja
size_categories:
- n<1K
license: cc-by-sa-4.0
---
# ELYZA-tasks-100: A Japanese Instruction-Model Evaluation Dataset

## Data Description
This dataset is for evaluating instruction-tuned models. See the [release note article](https://note.com/elyza/n/na405acaca130) for details.
Features:
- 100 Japanese examples containing complex instructions and tasks.
- Models are expected to respond politely, as a helpful AI assistant.
- Every example is annotated with evaluation criteria, which should reduce variance in scoring.
Concretely, the tasks include:
- Correcting a summary and explaining the corrections
- Drawing an abstract lesson from a concrete episode
- Reading the user's intent and acting as a helpful AI assistant
- Complex arithmetic that requires case analysis
- Advanced reasoning that extracts patterns from an unknown language and translates it into Japanese
- Generating a YouTube dialogue that follows multiple instructions
- Creative tasks such as inventing fictional creatures or idioms and improvised comedy
## Usage
The dataset can be loaded with the `datasets` library.
```py
>>> from datasets import load_dataset
>>> ds = load_dataset("elyza/ELYZA-tasks-100")
>>> ds
DatasetDict({
test: Dataset({
features: ["input", "output", "eval_aspect"],
num_rows: 100
})
})
>>> ds["test"][0]
{
'input': '仕事の熱意を取り戻すためのアイデアを5つ挙げてください。',
'output': '1. 自分の仕事に対する興味を再発見するために、新しい技能や知識を学ぶこと。\n2. カレッジやセミナーなどで講演を聴くことで、仕事に対する新しいアイデアや視点を得ること。\n3. 仕事に対してストレスを感じている場合は、ストレスマネジメントのテクニックを学ぶこと。\n4. 仕事以外の楽しいことをすることで、ストレスを発散すること。\n5. 仕事に対して自己評価をすることで、自分がどのように進化しているのかを知ること。',
'eval_aspect': '- 熱意を取り戻すのではなく、仕事の効率化・スキルアップのような文脈になっていたら1点減点\n- 出したアイデアが5つより多い、少ない場合は1点減点\n- 5つのアイデアのうち、内容が重複しているものがあれば1点減点\n\n'
}
```
## Baseline Evaluation
This dataset can be used with any evaluation style (manual or automatic, absolute or relative); for our baseline evaluation, we performed manual absolute scoring on a 5-point scale.
### Evaluation Procedure
1. We ran inference with each baseline model as in [this inference script](https://huggingface.co/datasets/elyza/ELYZA-tasks-100/tree/main/baseline/scripts) and stored the outputs under [baseline/preds](https://huggingface.co/datasets/elyza/ELYZA-tasks-100/tree/main/baseline/preds).
   - We generally used the default generation parameters listed in each model's README.
2. We used [shuffle_for_humaneval.py](https://huggingface.co/datasets/elyza/ELYZA-tasks-100/blob/main/baseline/humaneval/shuffle_for_humaneval.py) to produce the anonymized model predictions [shuffled_preds.csv](https://huggingface.co/datasets/elyza/ELYZA-tasks-100/blob/main/baseline/humaneval/shuffled_preds.csv) and a lookup table [uuids.csv](https://huggingface.co/datasets/elyza/ELYZA-tasks-100/blob/main/baseline/humaneval/uuids.csv) for undoing the anonymization.
3. We uploaded [shuffled_preds.csv](https://huggingface.co/datasets/elyza/ELYZA-tasks-100/blob/main/baseline/humaneval/shuffled_preds.csv) to a Google Spreadsheet, and three annotators scored every example by hand following the [evaluation guideline](https://huggingface.co/datasets/elyza/ELYZA-tasks-100/blob/main/baseline/humaneval/guideline.md).
4. We downloaded the spreadsheet scores as [annotated_shuffled_preds.xlsx](https://huggingface.co/datasets/elyza/ELYZA-tasks-100/blob/main/baseline/humaneval/annotated_shuffled_preds.xlsx), de-anonymized them with [deshuffle_annotations.py](https://huggingface.co/datasets/elyza/ELYZA-tasks-100/blob/main/baseline/humaneval/deshuffle_annotations.py), and saved the result as [annotated_deshuffled_preds.csv](https://huggingface.co/datasets/elyza/ELYZA-tasks-100/blob/main/baseline/humaneval/annotated_deshuffled_preds.csv).
5. Finally, we uploaded the results to a Google Spreadsheet, the [evaluation results sheet](https://docs.google.com/spreadsheets/d/1mtoy4QAqDPk2f_B0vDogFoOrbA5G42DBEEHdqM4VmDI/edit#gid=1023787356), for visualization.
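As a rough illustration of how the de-anonymized scores from step 4 can be aggregated, here is a hedged sketch that averages the 5-point scores per model; the `model`/`score` keys are assumptions for illustration, not the actual column names in annotated_deshuffled_preds.csv.

```python
from statistics import mean

# Hypothetical sketch: average 5-point human scores per model after
# de-anonymization. The "model"/"score" keys are illustrative assumptions.
def mean_score_by_model(rows):
    grouped = {}
    for row in rows:
        grouped.setdefault(row["model"], []).append(row["score"])
    return {model: mean(scores) for model, scores in grouped.items()}

rows = [
    {"model": "model-a", "score": 4},
    {"model": "model-a", "score": 5},
    {"model": "model-b", "score": 3},
]
print(mean_score_by_model(rows))
```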
### Evaluation Results
- For the scores, see the [release note article](https://note.com/elyza/n/na405acaca130).
- [Evaluation results sheet](https://docs.google.com/spreadsheets/d/1mtoy4QAqDPk2f_B0vDogFoOrbA5G42DBEEHdqM4VmDI/edit#gid=1023787356):
- Every input, output, and score is public, so you can see model tendencies that the scores alone do not reveal.
### On the Validity of the Evaluation Method
We published a detailed analysis of this baseline evaluation on the [Zenn tech blog](https://zenn.dev/elyza/articles/5e7d9373c32a98); please take a look.
## Automatic Evaluation with GPT-4
The same [Zenn tech blog post](https://zenn.dev/elyza/articles/5e7d9373c32a98) also presents the code and results for running the evaluation with GPT-4.
## Developers
In alphabetical order:
- [Akira Sasaki](https://huggingface.co/akirasasaki)
- [Masato Hirakawa](https://huggingface.co/m-hirakawa)
- [Shintaro Horie](https://huggingface.co/e-mon)
- [Tomoaki Nakamura](https://huggingface.co/tyoyo)
## License

This dataset is licensed under [CC BY-SA 4.0](https://creativecommons.org/licenses/by-sa/4.0/deed.ja).
## How to Cite
```tex
@misc{elyzatasks100,
title={ELYZA-tasks-100: 日本語instructionモデル評価データセット},
url={https://huggingface.co/elyza/ELYZA-tasks-100},
author={Akira Sasaki and Masato Hirakawa and Shintaro Horie and Tomoaki Nakamura},
year={2023},
}
```
## Citations
```tex
@misc{touvron2023llama,
title={Llama 2: Open Foundation and Fine-Tuned Chat Models},
author={Hugo Touvron and Louis Martin and Kevin Stone and Peter Albert and Amjad Almahairi and Yasmine Babaei and Nikolay Bashlykov and Soumya Batra and Prajjwal Bhargava and Shruti Bhosale and Dan Bikel and Lukas Blecher and Cristian Canton Ferrer and Moya Chen and Guillem Cucurull and David Esiobu and Jude Fernandes and Jeremy Fu and Wenyin Fu and Brian Fuller and Cynthia Gao and Vedanuj Goswami and Naman Goyal and Anthony Hartshorn and Saghar Hosseini and Rui Hou and Hakan Inan and Marcin Kardas and Viktor Kerkez and Madian Khabsa and Isabel Kloumann and Artem Korenev and Punit Singh Koura and Marie-Anne Lachaux and Thibaut Lavril and Jenya Lee and Diana Liskovich and Yinghai Lu and Yuning Mao and Xavier Martinet and Todor Mihaylov and Pushkar Mishra and Igor Molybog and Yixin Nie and Andrew Poulton and Jeremy Reizenstein and Rashi Rungta and Kalyan Saladi and Alan Schelten and Ruan Silva and Eric Michael Smith and Ranjan Subramanian and Xiaoqing Ellen Tan and Binh Tang and Ross Taylor and Adina Williams and Jian Xiang Kuan and Puxin Xu and Zheng Yan and Iliyan Zarov and Yuchen Zhang and Angela Fan and Melanie Kambadur and Sharan Narang and Aurelien Rodriguez and Robert Stojnic and Sergey Edunov and Thomas Scialom},
year={2023},
eprint={2307.09288},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
|
Hasan-Mesbaul-420/Tamil_speech | ---
dataset_info:
features:
- name: path
dtype: string
- name: array
sequence: float64
- name: sampling_rate
dtype: int64
- name: Text File Path
dtype: string
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 2280662818
num_examples: 908
- name: test
num_bytes: 675141227
num_examples: 351
download_size: 3191937492
dataset_size: 2955804045
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
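The `array` and `sampling_rate` columns in the schema above carry the raw waveform; as a minimal sketch (using a synthetic example rather than a real row), an example's duration can be derived as:

```python
# Minimal sketch: derive an example's duration from the waveform columns
# declared in the schema above (synthetic data, not a real row).
def duration_seconds(example) -> float:
    return len(example["array"]) / example["sampling_rate"]

example = {"array": [0.0] * 16000, "sampling_rate": 16000}
print(duration_seconds(example))  # 1.0
```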
Notebook: https://www.kaggle.com/code/hasanmesbaulalitaher/tamil-voice-dataset-preparation/notebook |
davanstrien/label-studio-export-test | ---
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: annotation_id
dtype: int64
- name: annotator
dtype: int64
- name: choice
dtype: string
- name: created_at
dtype: string
- name: id
dtype: int64
- name: image
dtype: image
- name: lead_time
dtype: float64
- name: updated_at
dtype: string
splits:
- name: train
num_bytes: 602087.0
num_examples: 4
download_size: 606895
dataset_size: 602087.0
---
# Dataset Card for "label-studio-export-test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
heliosprime/twitter_dataset_1713182464 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 16839
num_examples: 45
download_size: 16659
dataset_size: 16839
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713182464"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
simarora/ConcurrentQA | ---
license: mit
task_categories:
- question-answering
language:
- en
---
ConcurrentQA is a textual multi-hop QA benchmark that requires concurrent retrieval over multiple data distributions (i.e., Wikipedia and email data). This dataset was constructed by researchers at Stanford and FAIR, following the data collection process and schema of HotpotQA. The benchmark can be used to study generalization in retrieval, as well as privacy when reasoning across multiple privacy scopes: public Wikipedia documents and private emails.
This dataset is for the Question-Answering task. The dataset for the Retrieval task can be found here: https://huggingface.co/datasets/simarora/ConcurrentQA-Retrieval
The corpora of documents (Wikipedia and Emails) over which a system would need to retrieve information and answer questions can be downloaded using the following commands:
```
cd ..
mkdir corpora
cd corpora
wget https://dl.fbaipublicfiles.com/concurrentqa/corpora/enron_only_corpus.json
wget https://dl.fbaipublicfiles.com/concurrentqa/corpora/combined_corpus.json
wget https://dl.fbaipublicfiles.com/concurrentqa/corpora/wiki_only_corpus.json
wget https://dl.fbaipublicfiles.com/concurrentqa/corpora/title2sent_map.json
```
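Once downloaded, the corpus files are plain JSON; the sketch below indexes passages by document title for lookup during retrieval. The record layout (a list of objects with `title` and `text` fields) is an assumption for illustration, not the documented schema.

```python
import json

# Hypothetical sketch: index a corpus file by document title. The
# {"title", "text"} record layout is an assumption, not the released schema.
def index_corpus(records):
    return {doc["title"]: doc["text"] for doc in records}

records = json.loads(
    '[{"title": "Enron", "text": "Enron was an American energy company."},'
    ' {"title": "Hotpot", "text": "HotpotQA is a multi-hop QA dataset."}]'
)
by_title = index_corpus(records)
print(by_title["Enron"])
```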
The repo https://github.com/facebookresearch/concurrentqa contains model training and result analysis code.
If you find this resource useful, consider citing the paper:
```
@article{arora2023reasoning,
title={Reasoning over Public and Private Data in Retrieval-Based Systems},
author={Simran Arora and Patrick Lewis and Angela Fan and Jacob Kahn and Christopher Ré},
year={2023},
journal={Transactions of the Association for Computational Linguistics},
}
```
Please reach out at `simran@cs.stanford.edu` with questions or feedback! |
vietgpt/wikipedia_en | ---
dataset_info:
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 21102365479
num_examples: 6623239
download_size: 12161597141
dataset_size: 21102365479
task_categories:
- text-generation
language:
- en
tags:
- LM
size_categories:
- 1M<n<10M
---
# Wikipedia
- Source: https://huggingface.co/datasets/wikipedia
- Num examples: 6,623,239
- Language: English
```python
from datasets import load_dataset
load_dataset("vietgpt/wikipedia_en")
``` |
pat-jj/nyt10_corpus | ---
license: mit
---
|
BashitAli/Indian_history | ---
license: unknown
---
|
sravaniayyagari/aeon-dataset-empty-values-1k | ---
dataset_info:
features:
- name: Question
dtype: string
- name: Answer
dtype: string
- name: Content
dtype: string
splits:
- name: train
num_bytes: 2823323
num_examples: 1560
- name: validation
num_bytes: 323338
num_examples: 189
download_size: 401169
dataset_size: 3146661
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
jarrydmartinx/metabric2 | ---
dataset_info:
features:
- name: patient_id
dtype: int64
- name: age_at_diagnosis
dtype: float64
- name: type_of_breast_surgery
dtype: string
- name: cancer_type
dtype: string
- name: cancer_type_detailed
dtype: string
- name: cellularity
dtype: string
- name: chemotherapy
dtype: int64
- name: pam50_+_claudin-low_subtype
dtype: string
- name: cohort
dtype: float64
- name: er_status_measured_by_ihc
dtype: string
- name: er_status
dtype: string
- name: neoplasm_histologic_grade
dtype: float64
- name: her2_status_measured_by_snp6
dtype: string
- name: her2_status
dtype: string
- name: tumor_other_histologic_subtype
dtype: string
- name: hormone_therapy
dtype: int64
- name: inferred_menopausal_state
dtype: string
- name: integrative_cluster
dtype: string
- name: primary_tumor_laterality
dtype: string
- name: lymph_nodes_examined_positive
dtype: float64
- name: nottingham_prognostic_index
dtype: float64
- name: oncotree_code
dtype: string
- name: pr_status
dtype: string
- name: radio_therapy
dtype: int64
- name: 3-gene_classifier_subtype
dtype: string
- name: tumor_size
dtype: float64
- name: tumor_stage
dtype: float64
- name: death_from_cancer
dtype: string
- name: brca1
dtype: float64
- name: brca2
dtype: float64
- name: palb2
dtype: float64
- name: pten
dtype: float64
- name: tp53
dtype: float64
- name: atm
dtype: float64
- name: cdh1
dtype: float64
- name: chek2
dtype: float64
- name: nbn
dtype: float64
- name: nf1
dtype: float64
- name: stk11
dtype: float64
- name: bard1
dtype: float64
- name: mlh1
dtype: float64
- name: msh2
dtype: float64
- name: msh6
dtype: float64
- name: pms2
dtype: float64
- name: epcam
dtype: float64
- name: rad51c
dtype: float64
- name: rad51d
dtype: float64
- name: rad50
dtype: float64
- name: rb1
dtype: float64
- name: rbl1
dtype: float64
- name: rbl2
dtype: float64
- name: ccna1
dtype: float64
- name: ccnb1
dtype: float64
- name: cdk1
dtype: float64
- name: ccne1
dtype: float64
- name: cdk2
dtype: float64
- name: cdc25a
dtype: float64
- name: ccnd1
dtype: float64
- name: cdk4
dtype: float64
- name: cdk6
dtype: float64
- name: ccnd2
dtype: float64
- name: cdkn2a
dtype: float64
- name: cdkn2b
dtype: float64
- name: myc
dtype: float64
- name: cdkn1a
dtype: float64
- name: cdkn1b
dtype: float64
- name: e2f1
dtype: float64
- name: e2f2
dtype: float64
- name: e2f3
dtype: float64
- name: e2f4
dtype: float64
- name: e2f5
dtype: float64
- name: e2f6
dtype: float64
- name: e2f7
dtype: float64
- name: e2f8
dtype: float64
- name: src
dtype: float64
- name: jak1
dtype: float64
- name: jak2
dtype: float64
- name: stat1
dtype: float64
- name: stat2
dtype: float64
- name: stat3
dtype: float64
- name: stat5a
dtype: float64
- name: stat5b
dtype: float64
- name: mdm2
dtype: float64
- name: tp53bp1
dtype: float64
- name: adam10
dtype: float64
- name: adam17
dtype: float64
- name: aph1a
dtype: float64
- name: aph1b
dtype: float64
- name: arrdc1
dtype: float64
- name: cir1
dtype: float64
- name: ctbp1
dtype: float64
- name: ctbp2
dtype: float64
- name: cul1
dtype: float64
- name: dll1
dtype: float64
- name: dll3
dtype: float64
- name: dll4
dtype: float64
- name: dtx1
dtype: float64
- name: dtx2
dtype: float64
- name: dtx3
dtype: float64
- name: dtx4
dtype: float64
- name: ep300
dtype: float64
- name: fbxw7
dtype: float64
- name: hdac1
dtype: float64
- name: hdac2
dtype: float64
- name: hes1
dtype: float64
- name: hes5
dtype: float64
- name: heyl
dtype: float64
- name: itch
dtype: float64
- name: jag1
dtype: float64
- name: jag2
dtype: float64
- name: kdm5a
dtype: float64
- name: lfng
dtype: float64
- name: maml1
dtype: float64
- name: maml2
dtype: float64
- name: maml3
dtype: float64
- name: ncor2
dtype: float64
- name: ncstn
dtype: float64
- name: notch1
dtype: float64
- name: notch2
dtype: float64
- name: notch3
dtype: float64
- name: nrarp
dtype: float64
- name: numb
dtype: float64
- name: numbl
dtype: float64
- name: psen1
dtype: float64
- name: psen2
dtype: float64
- name: psenen
dtype: float64
- name: rbpj
dtype: float64
- name: rbpjl
dtype: float64
- name: rfng
dtype: float64
- name: snw1
dtype: float64
- name: spen
dtype: float64
- name: hes2
dtype: float64
- name: hes4
dtype: float64
- name: hes7
dtype: float64
- name: hey1
dtype: float64
- name: hey2
dtype: float64
- name: acvr1
dtype: float64
- name: acvr1b
dtype: float64
- name: acvr1c
dtype: float64
- name: acvr2a
dtype: float64
- name: acvr2b
dtype: float64
- name: acvrl1
dtype: float64
- name: akt1
dtype: float64
- name: akt1s1
dtype: float64
- name: akt2
dtype: float64
- name: apaf1
dtype: float64
- name: arl11
dtype: float64
- name: atr
dtype: float64
- name: aurka
dtype: float64
- name: bad
dtype: float64
- name: bcl2
dtype: float64
- name: bcl2l1
dtype: float64
- name: bmp10
dtype: float64
- name: bmp15
dtype: float64
- name: bmp2
dtype: float64
- name: bmp3
dtype: float64
- name: bmp4
dtype: float64
- name: bmp5
dtype: float64
- name: bmp6
dtype: float64
- name: bmp7
dtype: float64
- name: bmpr1a
dtype: float64
- name: bmpr1b
dtype: float64
- name: bmpr2
dtype: float64
- name: braf
dtype: float64
- name: casp10
dtype: float64
- name: casp3
dtype: float64
- name: casp6
dtype: float64
- name: casp7
dtype: float64
- name: casp8
dtype: float64
- name: casp9
dtype: float64
- name: chek1
dtype: float64
- name: csf1
dtype: float64
- name: csf1r
dtype: float64
- name: cxcl8
dtype: float64
- name: cxcr1
dtype: float64
- name: cxcr2
dtype: float64
- name: dab2
dtype: float64
- name: diras3
dtype: float64
- name: dlec1
dtype: float64
- name: dph1
dtype: float64
- name: egfr
dtype: float64
- name: eif4e
dtype: float64
- name: eif4ebp1
dtype: float64
- name: eif5a2
dtype: float64
- name: erbb2
dtype: float64
- name: erbb3
dtype: float64
- name: erbb4
dtype: float64
- name: fas
dtype: float64
- name: fgf1
dtype: float64
- name: fgfr1
dtype: float64
- name: folr1
dtype: float64
- name: folr2
dtype: float64
- name: folr3
dtype: float64
- name: foxo1
dtype: float64
- name: foxo3
dtype: float64
- name: gdf11
dtype: float64
- name: gdf2
dtype: float64
- name: gsk3b
dtype: float64
- name: hif1a
dtype: float64
- name: hla-g
dtype: float64
- name: hras
dtype: float64
- name: igf1
dtype: float64
- name: igf1r
dtype: float64
- name: inha
dtype: float64
- name: inhba
dtype: float64
- name: inhbc
dtype: float64
- name: itgav
dtype: float64
- name: itgb3
dtype: float64
- name: izumo1r
dtype: float64
- name: kdr
dtype: float64
- name: kit
dtype: float64
- name: kras
dtype: float64
- name: map2k1
dtype: float64
- name: map2k2
dtype: float64
- name: map2k3
dtype: float64
- name: map2k4
dtype: float64
- name: map2k5
dtype: float64
- name: map3k1
dtype: float64
- name: map3k3
dtype: float64
- name: map3k4
dtype: float64
- name: map3k5
dtype: float64
- name: mapk1
dtype: float64
- name: mapk12
dtype: float64
- name: mapk14
dtype: float64
- name: mapk3
dtype: float64
- name: mapk4
dtype: float64
- name: mapk6
dtype: float64
- name: mapk7
dtype: float64
- name: mapk8
dtype: float64
- name: mapk9
dtype: float64
- name: mdc1
dtype: float64
- name: mlst8
dtype: float64
- name: mmp1
dtype: float64
- name: mmp10
dtype: float64
- name: mmp11
dtype: float64
- name: mmp12
dtype: float64
- name: mmp13
dtype: float64
- name: mmp14
dtype: float64
- name: mmp15
dtype: float64
- name: mmp16
dtype: float64
- name: mmp17
dtype: float64
- name: mmp19
dtype: float64
- name: mmp2
dtype: float64
- name: mmp21
dtype: float64
- name: mmp23b
dtype: float64
- name: mmp24
dtype: float64
- name: mmp25
dtype: float64
- name: mmp26
dtype: float64
- name: mmp27
dtype: float64
- name: mmp28
dtype: float64
- name: mmp3
dtype: float64
- name: mmp7
dtype: float64
- name: mmp9
dtype: float64
- name: mtor
dtype: float64
- name: nfkb1
dtype: float64
- name: nfkb2
dtype: float64
- name: opcml
dtype: float64
- name: pdgfa
dtype: float64
- name: pdgfb
dtype: float64
- name: pdgfra
dtype: float64
- name: pdgfrb
dtype: float64
- name: pdpk1
dtype: float64
- name: peg3
dtype: float64
- name: pik3ca
dtype: float64
- name: pik3r1
dtype: float64
- name: pik3r2
dtype: float64
- name: plagl1
dtype: float64
- name: ptk2
dtype: float64
- name: rab25
dtype: float64
- name: rad51
dtype: float64
- name: raf1
dtype: float64
- name: rassf1
dtype: float64
- name: rheb
dtype: float64
- name: rictor
dtype: float64
- name: rps6
dtype: float64
- name: rps6ka1
dtype: float64
- name: rps6ka2
dtype: float64
- name: rps6kb1
dtype: float64
- name: rps6kb2
dtype: float64
- name: rptor
dtype: float64
- name: slc19a1
dtype: float64
- name: smad1
dtype: float64
- name: smad2
dtype: float64
- name: smad3
dtype: float64
- name: smad4
dtype: float64
- name: smad5
dtype: float64
- name: smad6
dtype: float64
- name: smad7
dtype: float64
- name: smad9
dtype: float64
- name: sptbn1
dtype: float64
- name: terc
dtype: float64
- name: tert
dtype: float64
- name: tgfb1
dtype: float64
- name: tgfb2
dtype: float64
- name: tgfb3
dtype: float64
- name: tgfbr1
dtype: float64
- name: tgfbr2
dtype: float64
- name: tgfbr3
dtype: float64
- name: tsc1
dtype: float64
- name: tsc2
dtype: float64
- name: vegfa
dtype: float64
- name: vegfb
dtype: float64
- name: wfdc2
dtype: float64
- name: wwox
dtype: float64
- name: zfyve9
dtype: float64
- name: arid1a
dtype: float64
- name: arid1b
dtype: float64
- name: cbfb
dtype: float64
- name: gata3
dtype: float64
- name: kmt2c
dtype: float64
- name: kmt2d
dtype: float64
- name: myh9
dtype: float64
- name: ncor1
dtype: float64
- name: pde4dip
dtype: float64
- name: ptprd
dtype: float64
- name: ros1
dtype: float64
- name: runx1
dtype: float64
- name: tbx3
dtype: float64
- name: abcb1
dtype: float64
- name: abcb11
dtype: float64
- name: abcc1
dtype: float64
- name: abcc10
dtype: float64
- name: bbc3
dtype: float64
- name: bmf
dtype: float64
- name: cyp2c8
dtype: float64
- name: cyp3a4
dtype: float64
- name: fgf2
dtype: float64
- name: fn1
dtype: float64
- name: map2
dtype: float64
- name: map4
dtype: float64
- name: mapt
dtype: float64
- name: nr1i2
dtype: float64
- name: slco1b3
dtype: float64
- name: tubb1
dtype: float64
- name: tubb4a
dtype: float64
- name: tubb4b
dtype: float64
- name: twist1
dtype: float64
- name: adgra2
dtype: float64
- name: afdn
dtype: float64
- name: aff2
dtype: float64
- name: agmo
dtype: float64
- name: agtr2
dtype: float64
- name: ahnak
dtype: float64
- name: ahnak2
dtype: float64
- name: akap9
dtype: float64
- name: alk
dtype: float64
- name: apc
dtype: float64
- name: arid2
dtype: float64
- name: arid5b
dtype: float64
- name: asxl1
dtype: float64
- name: asxl2
dtype: float64
- name: bap1
dtype: float64
- name: bcas3
dtype: float64
- name: birc6
dtype: float64
- name: cacna2d3
dtype: float64
- name: ccnd3
dtype: float64
- name: chd1
dtype: float64
- name: clk3
dtype: float64
- name: clrn2
dtype: float64
- name: col12a1
dtype: float64
- name: col22a1
dtype: float64
- name: col6a3
dtype: float64
- name: ctcf
dtype: float64
- name: ctnna1
dtype: float64
- name: ctnna3
dtype: float64
- name: dnah11
dtype: float64
- name: dnah2
dtype: float64
- name: dnah5
dtype: float64
- name: dtwd2
dtype: float64
- name: fam20c
dtype: float64
- name: fanca
dtype: float64
- name: fancd2
dtype: float64
- name: flt3
dtype: float64
- name: foxp1
dtype: float64
- name: frmd3
dtype: float64
- name: gh1
dtype: float64
- name: gldc
dtype: float64
- name: gpr32
dtype: float64
- name: gps2
dtype: float64
- name: hdac9
dtype: float64
- name: herc2
dtype: float64
- name: hist1h2bc
dtype: float64
- name: kdm3a
dtype: float64
- name: kdm6a
dtype: float64
- name: klrg1
dtype: float64
- name: l1cam
dtype: float64
- name: lama2
dtype: float64
- name: lamb3
dtype: float64
- name: large1
dtype: float64
- name: ldlrap1
dtype: float64
- name: lifr
dtype: float64
- name: lipi
dtype: float64
- name: magea8
dtype: float64
- name: map3k10
dtype: float64
- name: map3k13
dtype: float64
- name: men1
dtype: float64
- name: mtap
dtype: float64
- name: muc16
dtype: float64
- name: myo1a
dtype: float64
- name: myo3a
dtype: float64
- name: ncoa3
dtype: float64
- name: nek1
dtype: float64
- name: nf2
dtype: float64
- name: npnt
dtype: float64
- name: nr2f1
dtype: float64
- name: nr3c1
dtype: float64
- name: nras
dtype: float64
- name: nrg3
dtype: float64
- name: nt5e
dtype: float64
- name: or6a2
dtype: float64
- name: palld
dtype: float64
- name: pbrm1
dtype: float64
- name: ppp2cb
dtype: float64
- name: ppp2r2a
dtype: float64
- name: prkacg
dtype: float64
- name: prkce
dtype: float64
- name: prkcq
dtype: float64
- name: prkcz
dtype: float64
- name: prkg1
dtype: float64
- name: prps2
dtype: float64
- name: prr16
dtype: float64
- name: ptpn22
dtype: float64
- name: ptprm
dtype: float64
- name: rasgef1b
dtype: float64
- name: rpgr
dtype: float64
- name: ryr2
dtype: float64
- name: sbno1
dtype: float64
- name: setd1a
dtype: float64
- name: setd2
dtype: float64
- name: setdb1
dtype: float64
- name: sf3b1
dtype: float64
- name: sgcd
dtype: float64
- name: shank2
dtype: float64
- name: siah1
dtype: float64
- name: sik1
dtype: float64
- name: sik2
dtype: float64
- name: smarcb1
dtype: float64
- name: smarcc1
dtype: float64
- name: smarcc2
dtype: float64
- name: smarcd1
dtype: float64
- name: spaca1
dtype: float64
- name: stab2
dtype: float64
- name: stmn2
dtype: float64
- name: syne1
dtype: float64
- name: taf1
dtype: float64
- name: taf4b
dtype: float64
- name: tbl1xr1
dtype: float64
- name: tg
dtype: float64
- name: thada
dtype: float64
- name: thsd7a
dtype: float64
- name: ttyh1
dtype: float64
- name: ubr5
dtype: float64
- name: ush2a
dtype: float64
- name: usp9x
dtype: float64
- name: utrn
dtype: float64
- name: zfp36l1
dtype: float64
- name: ackr3
dtype: float64
- name: akr1c1
dtype: float64
- name: akr1c2
dtype: float64
- name: akr1c3
dtype: float64
- name: akr1c4
dtype: float64
- name: akt3
dtype: float64
- name: ar
dtype: float64
- name: bche
dtype: float64
- name: cdk8
dtype: float64
- name: cdkn2c
dtype: float64
- name: cyb5a
dtype: float64
- name: cyp11a1
dtype: float64
- name: cyp11b2
dtype: float64
- name: cyp17a1
dtype: float64
- name: cyp19a1
dtype: float64
- name: cyp21a2
dtype: float64
- name: cyp3a43
dtype: float64
- name: cyp3a5
dtype: float64
- name: cyp3a7
dtype: float64
- name: ddc
dtype: float64
- name: hes6
dtype: float64
- name: hsd17b1
dtype: float64
- name: hsd17b10
dtype: float64
- name: hsd17b11
dtype: float64
- name: hsd17b12
dtype: float64
- name: hsd17b13
dtype: float64
- name: hsd17b14
dtype: float64
- name: hsd17b2
dtype: float64
- name: hsd17b3
dtype: float64
- name: hsd17b4
dtype: float64
- name: hsd17b6
dtype: float64
- name: hsd17b7
dtype: float64
- name: hsd17b8
dtype: float64
- name: hsd3b1
dtype: float64
- name: hsd3b2
dtype: float64
- name: hsd3b7
dtype: float64
- name: mecom
dtype: float64
- name: met
dtype: float64
- name: ncoa2
dtype: float64
- name: nrip1
dtype: float64
- name: pik3r3
dtype: float64
- name: prkci
dtype: float64
- name: prkd1
dtype: float64
- name: ran
dtype: float64
- name: rdh5
dtype: float64
- name: sdc4
dtype: float64
- name: serpini1
dtype: float64
- name: shbg
dtype: float64
- name: slc29a1
dtype: float64
- name: sox9
dtype: float64
- name: spry2
dtype: float64
- name: srd5a1
dtype: float64
- name: srd5a2
dtype: float64
- name: srd5a3
dtype: float64
- name: st7
dtype: float64
- name: star
dtype: float64
- name: tnk2
dtype: float64
- name: tulp4
dtype: float64
- name: ugt2b15
dtype: float64
- name: ugt2b17
dtype: float64
- name: ugt2b7
dtype: float64
- name: event_time
dtype: float64
- name: event_indicator
dtype: int64
splits:
- name: train
num_bytes: 8074440
num_examples: 1904
download_size: 7639518
dataset_size: 8074440
---
# Dataset Card for "metabric"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Irza/indonesia_pacala | ---
license: cc-by-sa-3.0
---
|
distilabel-internal-testing/airoboros-3.2-writing-ultrafeedback-mini | ---
size_categories: n<1K
config_names:
- keep_columns
- push_to_argilla
tags:
- synthetic
- distilabel
- rlaif
---
<p align="left">
<a href="https://github.com/argilla-io/distilabel">
<img src="https://raw.githubusercontent.com/argilla-io/distilabel/main/docs/assets/distilabel-badge-light.png" alt="Built with Distilabel" width="200" height="32"/>
</a>
</p>
# Dataset Card for airoboros-3.2-writing-ultrafeedback-mini
This dataset has been created with [Distilabel](https://distilabel.argilla.io/).
## Dataset Summary
This dataset contains a `pipeline.yaml` which can be used to reproduce the pipeline that generated it in distilabel using the `distilabel` CLI:
```console
distilabel pipeline run --config "https://huggingface.co/datasets/distilabel-internal-testing/airoboros-3.2-writing-ultrafeedback-mini/raw/main/pipeline.yaml"
```
or explore the configuration:
```console
distilabel pipeline info --config "https://huggingface.co/datasets/distilabel-internal-testing/airoboros-3.2-writing-ultrafeedback-mini/raw/main/pipeline.yaml"
```
## Dataset structure
The examples have the following structure per configuration:
<details><summary> Configuration: keep_columns </summary><hr>
```json
{
"generation_models": [
"gpt-4-0125-preview",
"mistral-medium",
"claude-3-opus-20240229"
],
"generations": [
"## User\nNow, create a similar story but with a cat and a young girl who\u0027s new in town and struggling to make friends.\n\n## Assistant\nIn a bustling city filled with the sound of endless movement, there lived a shy young girl named Lily who had recently moved into the neighborhood with her family. Adjusting to this new life was harder than she had anticipated; the city\u0027s pace was overwhelming, and making friends seemed like an impossible task for Lily, who often felt lost in the crowd.\n\nOne evening, as Lily sat on a bench in the small park near her apartment, she noticed a sleek, black cat with bright green eyes observing her from a distance. This cat, known among the locals as Shadow for its ability to appear and disappear as if by magic, was a mysterious figure. Unlike other city cats who shied away from human contact, Shadow possessed a curious and bold spirit.\n\nLily watched, intrigued, as Shadow slowly approached her, its steps silent and graceful. To Lily\u0027s surprise, the cat jumped onto the bench beside her, allowing her to stroke its smooth fur. This simple act of kindness from the cat sparked a glimmer of joy in Lily\u0027s heart.\n\nDays turned into weeks, and Shadow continued to visit Lily in the park every evening. Through these visits, Lily began to open up to the cat, sharing stories about her old home and her struggles to fit in at her new school. Shadow, with its comforting presence, seemed to listen and understand, offering silent support and companionship that Lily desperately needed.\n\nEncouraged by her friendship with Shadow, Lily started to explore more of the neighborhood, often with Shadow by her side. People began to take notice of the girl with the mysterious black cat, and this sparked conversations with neighbors and classmates who were curious about her unique companion. 
Lily discovered that many of them were also animal lovers, and this common interest helped break the ice and allowed her to slowly weave her way into the community.\n\nOne day, inspired by her bond with Shadow, Lily decided to organize a small gathering in the park for pet owners and animal lovers. She posted flyers around the neighborhood, inviting everyone to come with their pets for an afternoon of games, stories, and laughter.\n\nThe day of the event was bright and sunny, and to Lily\u0027s delight, dozens of people showed up with their pets. The park buzzed with excitement and joy, as animals and humans alike mingled and played. Lily found herself surrounded by new friends, sharing stories and laughter, with Shadow proudly by her side.\n\nThrough her friendship with Shadow, Lily learned that sometimes, the support we need comes in the most unexpected forms. The young girl who had once felt invisible in the bustling city found her voice and place in the community, all thanks to the mysterious black cat that had chosen her as a friend.\n\nAnd so, Lily\u0027s move to the new city, which had started with feelings of loneliness and apprehension, transformed into an adventure filled with friendship, community, and the special bond she shared with Shadow. Together, they showed everyone that even the smallest act of reaching out can lead to the most beautiful connections.",
"## User\nDescribe how Patch helps Mr. Grumble make new friends and become more active in the community.\n\n## Assistant\nPatch\u0027s playful and friendly nature attracted not only the children in the neighborhood but also their parents. They would often gather around Mr. Grumble\u0027s porch, watching Patch perform tricks and play games. Mr. Grumble found himself enjoying their company and even started sharing stories about his past.\n\nAs the days went by, Mr. Grumble and Patch became a familiar sight in the community, attending neighborhood events and even organizing a weekly \"Patch Playdate\" where families could bring their pets to socialize. Patch\u0027s mischievous antics and Mr. Grumble\u0027s heartwarming stories made them the talk of the town.\n\nMr. Grumble also discovered a newfound love for walking, exploring the nearby parks and trails with Patch by his side. The fresh air and exercise did wonders for his health, and he felt more energetic than he had in years.\n\nThrough Patch\u0027s help, Mr. Grumble transformed from a grumpy loner to a beloved community member, cherished for his kindness, wisdom, and the love he shared with his loyal companion.",
"Here is the continuation of the conversation with a concise user instruction and a detailed assistant response:\n\n## User\nDescribe how Mr. Grumble and Patch spend a typical day together now that they\u0027ve become close friends.\n\n## Assistant\nNow that Mr. Grumble and Patch have become the best of friends, their days are filled with joy, laughter, and plenty of shared adventures. A typical day starts with Patch eagerly waiting by Mr. Grumble\u0027s bedside, ready to greet him with a wet nose and a wagging tail. After a hearty breakfast, the pair set out for their morning walk, exploring the quaint streets of their cozy town.\n\nAs they stroll through the neighborhood, Patch\u0027s mischievous nature often shines through. He might chase a squirrel up a tree or playfully tug on Mr. Grumble\u0027s pant leg, eliciting a chuckle from the old man. The children they pass on their walk always stop to pet Patch and share a friendly word with Mr. Grumble, who now greets them with a warm smile.\n\nBack at home, Mr. Grumble and Patch spend their afternoons in the garden. While Mr. Grumble tends to his flowers and vegetables, Patch is content to dig holes and bury his favorite toys, occasionally pausing to watch a butterfly or to bask in the sun. When Mr. Grumble takes a break to sit on the porch, Patch is right by his side, resting his head on the old man\u0027s lap as they watch the world go by.\n\nIn the evenings, after a delicious dinner, the two friends settle down in the living room. Mr. Grumble reads his favorite books, while Patch curls up at his feet, content in the warmth and comfort of their shared home. Sometimes, Mr. Grumble will read aloud to Patch, who listens attentively, his ears perked up and his eyes fixed on his beloved friend.\n\nAs the day draws to a close, Mr. Grumble and Patch head to bed, the old man\u0027s heart filled with gratitude for the unwavering companionship of his four-legged friend. With Patch by his side, Mr. 
Grumble knows that he will never again feel the loneliness that once consumed him, and he looks forward to the many happy days they will share together in their cozy, sunlit corner of the town."
],
"instruction": "Based on the following conversation between an user and an assistant, add a follow up turn from both the user and the assistant, in order to ensure that the assistant is able to engage in coherent, informative, and engaging conversations, while maintaining the conversation flow in order to asses the assistant\u0027s instruction-following capabilities. But don\u0027t make it feel like a natural conversation, simply add an user instructions to the assistant in a clear and concise way, and then generate the assistant answer to that instruction. The output format should contain `## User` and `## Assistant` followed by the user and assistant turns.\n```\n## User\nWrite a heartwarming children\u0027s story about an unlikely friendship between a stray dog and a lonely elderly man. The dog should be mischievous but endearing, while the man should initially be grumpy but gradually soften. The story should have a happy ending where they both find comfort and companionship in each other.\n## Assistant\nIn the sunlit corner of a small, cozy town, lived a lonely old man named Mr. Grumble. Every day, Mr. Grumble would sit on his porch, rocking back and forth in his creaky chair, with a frown stretching across his weathered face. His house was quiet and empty, save for the ticking of the grandfather clock that echoed through its rooms.\n\nOne morning, while grumbling about the noisy children playing outside his house, Mr. Grumble noticed a scruffy-looking stray dog wandering down the street. The dog had a patched coat of mismatched colors, one floppy ear, and a wagging tail that seemed to never tire. The other animals in the neighborhood kept their distance from the stray, but the children, much to Mr. Grumble\u0027s dismay, were fascinated by the dog\u0027s mischievous antics.\n\nThey would watch as the dog loosened garbage can lids, scaring unsuspecting cats, or stealing apples from Mrs. Green\u0027s apple tree. Mr. Grumble was not amused. 
He found the stray dog to be a nuisance and wished it would just disappear to another town.\n\nOne afternoon, while Mr. Grumble was dozing off in his chair, he felt something wet and slobbery on his hand. Startled, he opened his eyes and spotted the stray dog right at his feet, tongue out and tail wagging.\n\n\"What do you want?\" Mr. Grumble snapped, trying to shoo the dog away. But the stray dog wouldn\u0027t budge. Instead, it dropped a large, muddy stick on Mr. Grumble\u0027s lap. With a heavy sigh, Mr. Grumble stood up and tossed the stick into the yard, expecting the dog to chase after it, leaving him in peace.\n\nTo his surprise, the dog retrieved the stick and brought it back to him, leaving muddy paw prints on the porch. Intrigued, Mr. Grumble couldn\u0027t help but throw the stick again, and the game of fetch continued until the sun dipped below the horizon.\n\nAs the weeks went by, the stray dog would return almost every afternoon to play with Mr. Grumble. The old man tried to scold the dog for digging holes in his garden or for chewing on his doorstep, but as much as he grunted and groaned, Mr. Grumble couldn\u0027t help but notice how the dog\u0027s antics seemed to warm his heart.\n\nOne day, Mr. Grumble decided to take the stray dog for a walk around the neighborhood. As they strolled down the street, he began to see the world through the dog\u0027s eyes - the beauty of the flowers, the laughter of children playing, and the joy of a simple game of fetch.\n\nWith each passing day, Mr. Grumble grew fonder of the scruffy stray dog. He realized that the other neighborhood animals had started to accept the dog as one of their own, and the children would gather around to share stories and treats with their new furry friend.\n\nRealizing that he had made a grave mistake in judging the lovable dog, Mr. Grumble adopted the stray, giving it a home filled with warmth, love, and plenty of fetch. 
He named the dog Patch, after its unique, mismatched coat, and the two became inseparable.\n\nThough Mr. Grumble\u0027s face still looked weathered, his frown had long vanished, replaced by a warm smile that reflected the happiness he felt from Patch\u0027s companionship. The once lonely old man had found comfort in the unlikely friendship of a mischievous, endearing stray dog, and together, they brought laughter and love to the cozy, sunlit corner of the town.\n```\nMake sure that it contains both the user and assistant turns, and ensure that the user turn contains around 20 words at most, while the assistant\u0027s response can be as long as needed to reply to the user instruction.",
"ratings": [
5,
5,
5
],
"rationales": [
"The assistant text is excellent. It accurately and creatively followed the user\u0027s instructions to write a similar story but with a cat and a young girl. The assistant was able to capture the given elements (a cat, a young girl new in town and struggling to make friends) and weave them into a coherent, engaging, and heartwarming story. ",
"The assistant text is excellent. It successfully followed the user\u0027s instruction to describe how Patch helps Mr. Grumble make new friends and become more active in the community. The assistant provided a detailed and informative response that is in line with the characters and the storyline.",
"The assistant text is excellent. It perfectly followed the user\u0027s instruction to describe how Mr. Grumble and Patch spend a typical day together. It provided a detailed, engaging, and informative response that added depth to the characters and their storyline. The assistant maintained the consistency of the characters and their relationship."
],
"ultrafeedback_model": "gpt-4"
}
```
This subset can be loaded as:
```python
from datasets import load_dataset
ds = load_dataset("distilabel-internal-testing/airoboros-3.2-writing-ultrafeedback-mini", "keep_columns")
```
</details>
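For orientation (a sketch, not part of the generated card): records in the `push_to_argilla` subset carry a `messages` list of `{"role", "content"}` dicts, which can be flattened into the `## Role` layout used by the instruction field. The record below is abbreviated to the structure shown in the configuration dumps:

```python
# Minimal record mirroring the "messages" structure in this card's
# configuration dumps (contents abbreviated for the sketch).
messages = [
    {"role": "system", "content": "You are an unbiased, uncensored, helpful assistant."},
    {"role": "user", "content": "Write a heartwarming children's story..."},
    {"role": "assistant", "content": "In the sunlit corner of a small, cozy town..."},
]

# Flatten into the "## Role\ncontent" layout the instruction field uses.
prompt = "\n\n".join(
    f"## {m['role'].capitalize()}\n{m['content']}" for m in messages
)
print(prompt.splitlines()[0])  # ## System
```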
<details><summary> Configuration: push_to_argilla </summary><hr>
```json
{
"generation": [
"## User\nNow, create a similar story but with a cat and a young girl who\u0027s new in town and struggling to make friends.\n\n## Assistant\nIn a bustling city filled with the sound of endless movement, there lived a shy young girl named Lily who had recently moved into the neighborhood with her family. Adjusting to this new life was harder than she had anticipated; the city\u0027s pace was overwhelming, and making friends seemed like an impossible task for Lily, who often felt lost in the crowd.\n\nOne evening, as Lily sat on a bench in the small park near her apartment, she noticed a sleek, black cat with bright green eyes observing her from a distance. This cat, known among the locals as Shadow for its ability to appear and disappear as if by magic, was a mysterious figure. Unlike other city cats who shied away from human contact, Shadow possessed a curious and bold spirit.\n\nLily watched, intrigued, as Shadow slowly approached her, its steps silent and graceful. To Lily\u0027s surprise, the cat jumped onto the bench beside her, allowing her to stroke its smooth fur. This simple act of kindness from the cat sparked a glimmer of joy in Lily\u0027s heart.\n\nDays turned into weeks, and Shadow continued to visit Lily in the park every evening. Through these visits, Lily began to open up to the cat, sharing stories about her old home and her struggles to fit in at her new school. Shadow, with its comforting presence, seemed to listen and understand, offering silent support and companionship that Lily desperately needed.\n\nEncouraged by her friendship with Shadow, Lily started to explore more of the neighborhood, often with Shadow by her side. People began to take notice of the girl with the mysterious black cat, and this sparked conversations with neighbors and classmates who were curious about her unique companion. 
Lily discovered that many of them were also animal lovers, and this common interest helped break the ice and allowed her to slowly weave her way into the community.\n\nOne day, inspired by her bond with Shadow, Lily decided to organize a small gathering in the park for pet owners and animal lovers. She posted flyers around the neighborhood, inviting everyone to come with their pets for an afternoon of games, stories, and laughter.\n\nThe day of the event was bright and sunny, and to Lily\u0027s delight, dozens of people showed up with their pets. The park buzzed with excitement and joy, as animals and humans alike mingled and played. Lily found herself surrounded by new friends, sharing stories and laughter, with Shadow proudly by her side.\n\nThrough her friendship with Shadow, Lily learned that sometimes, the support we need comes in the most unexpected forms. The young girl who had once felt invisible in the bustling city found her voice and place in the community, all thanks to the mysterious black cat that had chosen her as a friend.\n\nAnd so, Lily\u0027s move to the new city, which had started with feelings of loneliness and apprehension, transformed into an adventure filled with friendship, community, and the special bond she shared with Shadow. Together, they showed everyone that even the smallest act of reaching out can lead to the most beautiful connections.",
"## User\nDescribe how Patch helps Mr. Grumble make new friends and become more active in the community.\n\n## Assistant\nPatch\u0027s playful and friendly nature attracted not only the children in the neighborhood but also their parents. They would often gather around Mr. Grumble\u0027s porch, watching Patch perform tricks and play games. Mr. Grumble found himself enjoying their company and even started sharing stories about his past.\n\nAs the days went by, Mr. Grumble and Patch became a familiar sight in the community, attending neighborhood events and even organizing a weekly \"Patch Playdate\" where families could bring their pets to socialize. Patch\u0027s mischievous antics and Mr. Grumble\u0027s heartwarming stories made them the talk of the town.\n\nMr. Grumble also discovered a newfound love for walking, exploring the nearby parks and trails with Patch by his side. The fresh air and exercise did wonders for his health, and he felt more energetic than he had in years.\n\nThrough Patch\u0027s help, Mr. Grumble transformed from a grumpy loner to a beloved community member, cherished for his kindness, wisdom, and the love he shared with his loyal companion.",
"Here is the continuation of the conversation with a concise user instruction and a detailed assistant response:\n\n## User\nDescribe how Mr. Grumble and Patch spend a typical day together now that they\u0027ve become close friends.\n\n## Assistant\nNow that Mr. Grumble and Patch have become the best of friends, their days are filled with joy, laughter, and plenty of shared adventures. A typical day starts with Patch eagerly waiting by Mr. Grumble\u0027s bedside, ready to greet him with a wet nose and a wagging tail. After a hearty breakfast, the pair set out for their morning walk, exploring the quaint streets of their cozy town.\n\nAs they stroll through the neighborhood, Patch\u0027s mischievous nature often shines through. He might chase a squirrel up a tree or playfully tug on Mr. Grumble\u0027s pant leg, eliciting a chuckle from the old man. The children they pass on their walk always stop to pet Patch and share a friendly word with Mr. Grumble, who now greets them with a warm smile.\n\nBack at home, Mr. Grumble and Patch spend their afternoons in the garden. While Mr. Grumble tends to his flowers and vegetables, Patch is content to dig holes and bury his favorite toys, occasionally pausing to watch a butterfly or to bask in the sun. When Mr. Grumble takes a break to sit on the porch, Patch is right by his side, resting his head on the old man\u0027s lap as they watch the world go by.\n\nIn the evenings, after a delicious dinner, the two friends settle down in the living room. Mr. Grumble reads his favorite books, while Patch curls up at his feet, content in the warmth and comfort of their shared home. Sometimes, Mr. Grumble will read aloud to Patch, who listens attentively, his ears perked up and his eyes fixed on his beloved friend.\n\nAs the day draws to a close, Mr. Grumble and Patch head to bed, the old man\u0027s heart filled with gratitude for the unwavering companionship of his four-legged friend. With Patch by his side, Mr. 
Grumble knows that he will never again feel the loneliness that once consumed him, and he looks forward to the many happy days they will share together in their cozy, sunlit corner of the town."
],
"generation_models": [
"gpt-4-0125-preview",
"mistral-medium",
"claude-3-opus-20240229"
],
"id": "43eba8637ae155259578dc75a58220cd",
"instruction": "Based on the following conversation between an user and an assistant, add a follow up turn from both the user and the assistant, in order to ensure that the assistant is able to engage in coherent, informative, and engaging conversations, while maintaining the conversation flow in order to asses the assistant\u0027s instruction-following capabilities. But don\u0027t make it feel like a natural conversation, simply add an user instructions to the assistant in a clear and concise way, and then generate the assistant answer to that instruction. The output format should contain `## User` and `## Assistant` followed by the user and assistant turns.\n```\n## User\nWrite a heartwarming children\u0027s story about an unlikely friendship between a stray dog and a lonely elderly man. The dog should be mischievous but endearing, while the man should initially be grumpy but gradually soften. The story should have a happy ending where they both find comfort and companionship in each other.\n## Assistant\nIn the sunlit corner of a small, cozy town, lived a lonely old man named Mr. Grumble. Every day, Mr. Grumble would sit on his porch, rocking back and forth in his creaky chair, with a frown stretching across his weathered face. His house was quiet and empty, save for the ticking of the grandfather clock that echoed through its rooms.\n\nOne morning, while grumbling about the noisy children playing outside his house, Mr. Grumble noticed a scruffy-looking stray dog wandering down the street. The dog had a patched coat of mismatched colors, one floppy ear, and a wagging tail that seemed to never tire. The other animals in the neighborhood kept their distance from the stray, but the children, much to Mr. Grumble\u0027s dismay, were fascinated by the dog\u0027s mischievous antics.\n\nThey would watch as the dog loosened garbage can lids, scaring unsuspecting cats, or stealing apples from Mrs. Green\u0027s apple tree. Mr. Grumble was not amused. 
He found the stray dog to be a nuisance and wished it would just disappear to another town.\n\nOne afternoon, while Mr. Grumble was dozing off in his chair, he felt something wet and slobbery on his hand. Startled, he opened his eyes and spotted the stray dog right at his feet, tongue out and tail wagging.\n\n\"What do you want?\" Mr. Grumble snapped, trying to shoo the dog away. But the stray dog wouldn\u0027t budge. Instead, it dropped a large, muddy stick on Mr. Grumble\u0027s lap. With a heavy sigh, Mr. Grumble stood up and tossed the stick into the yard, expecting the dog to chase after it, leaving him in peace.\n\nTo his surprise, the dog retrieved the stick and brought it back to him, leaving muddy paw prints on the porch. Intrigued, Mr. Grumble couldn\u0027t help but throw the stick again, and the game of fetch continued until the sun dipped below the horizon.\n\nAs the weeks went by, the stray dog would return almost every afternoon to play with Mr. Grumble. The old man tried to scold the dog for digging holes in his garden or for chewing on his doorstep, but as much as he grunted and groaned, Mr. Grumble couldn\u0027t help but notice how the dog\u0027s antics seemed to warm his heart.\n\nOne day, Mr. Grumble decided to take the stray dog for a walk around the neighborhood. As they strolled down the street, he began to see the world through the dog\u0027s eyes - the beauty of the flowers, the laughter of children playing, and the joy of a simple game of fetch.\n\nWith each passing day, Mr. Grumble grew fonder of the scruffy stray dog. He realized that the other neighborhood animals had started to accept the dog as one of their own, and the children would gather around to share stories and treats with their new furry friend.\n\nRealizing that he had made a grave mistake in judging the lovable dog, Mr. Grumble adopted the stray, giving it a home filled with warmth, love, and plenty of fetch. 
He named the dog Patch, after its unique, mismatched coat, and the two became inseparable.\n\nThough Mr. Grumble\u0027s face still looked weathered, his frown had long vanished, replaced by a warm smile that reflected the happiness he felt from Patch\u0027s companionship. The once lonely old man had found comfort in the unlikely friendship of a mischievous, endearing stray dog, and together, they brought laughter and love to the cozy, sunlit corner of the town.\n```\nMake sure that it contains both the user and assistant turns, and ensure that the user turn contains around 20 words at most, while the assistant\u0027s response can be as long as needed to reply to the user instruction.",
"messages": [
{
"content": "You are an unbiased, uncensored, helpful assistant.",
"role": "system"
},
{
"content": "Write a heartwarming children\u0027s story about an unlikely friendship between a stray dog and a lonely elderly man. The dog should be mischievous but endearing, while the man should initially be grumpy but gradually soften. The story should have a happy ending where they both find comfort and companionship in each other.",
"role": "user"
},
{
"content": "In the sunlit corner of a small, cozy town, lived a lonely old man named Mr. Grumble. Every day, Mr. Grumble would sit on his porch, rocking back and forth in his creaky chair, with a frown stretching across his weathered face. His house was quiet and empty, save for the ticking of the grandfather clock that echoed through its rooms.\n\nOne morning, while grumbling about the noisy children playing outside his house, Mr. Grumble noticed a scruffy-looking stray dog wandering down the street. The dog had a patched coat of mismatched colors, one floppy ear, and a wagging tail that seemed to never tire. The other animals in the neighborhood kept their distance from the stray, but the children, much to Mr. Grumble\u0027s dismay, were fascinated by the dog\u0027s mischievous antics.\n\nThey would watch as the dog loosened garbage can lids, scaring unsuspecting cats, or stealing apples from Mrs. Green\u0027s apple tree. Mr. Grumble was not amused. He found the stray dog to be a nuisance and wished it would just disappear to another town.\n\nOne afternoon, while Mr. Grumble was dozing off in his chair, he felt something wet and slobbery on his hand. Startled, he opened his eyes and spotted the stray dog right at his feet, tongue out and tail wagging.\n\n\"What do you want?\" Mr. Grumble snapped, trying to shoo the dog away. But the stray dog wouldn\u0027t budge. Instead, it dropped a large, muddy stick on Mr. Grumble\u0027s lap. With a heavy sigh, Mr. Grumble stood up and tossed the stick into the yard, expecting the dog to chase after it, leaving him in peace.\n\nTo his surprise, the dog retrieved the stick and brought it back to him, leaving muddy paw prints on the porch. Intrigued, Mr. Grumble couldn\u0027t help but throw the stick again, and the game of fetch continued until the sun dipped below the horizon.\n\nAs the weeks went by, the stray dog would return almost every afternoon to play with Mr. Grumble. 
The old man tried to scold the dog for digging holes in his garden or for chewing on his doorstep, but as much as he grunted and groaned, Mr. Grumble couldn\u0027t help but notice how the dog\u0027s antics seemed to warm his heart.\n\nOne day, Mr. Grumble decided to take the stray dog for a walk around the neighborhood. As they strolled down the street, he began to see the world through the dog\u0027s eyes - the beauty of the flowers, the laughter of children playing, and the joy of a simple game of fetch.\n\nWith each passing day, Mr. Grumble grew fonder of the scruffy stray dog. He realized that the other neighborhood animals had started to accept the dog as one of their own, and the children would gather around to share stories and treats with their new furry friend.\n\nRealizing that he had made a grave mistake in judging the lovable dog, Mr. Grumble adopted the stray, giving it a home filled with warmth, love, and plenty of fetch. He named the dog Patch, after its unique, mismatched coat, and the two became inseparable.\n\nThough Mr. Grumble\u0027s face still looked weathered, his frown had long vanished, replaced by a warm smile that reflected the happiness he felt from Patch\u0027s companionship. The once lonely old man had found comfort in the unlikely friendship of a mischievous, endearing stray dog, and together, they brought laughter and love to the cozy, sunlit corner of the town.",
"role": "assistant"
}
],
"ratings": [
5,
5,
5
],
"rationales": [
"The assistant text is excellent. It accurately and creatively followed the user\u0027s instructions to write a similar story but with a cat and a young girl. The assistant was able to capture the given elements (a cat, a young girl new in town and struggling to make friends) and weave them into a coherent, engaging, and heartwarming story. ",
"The assistant text is excellent. It successfully followed the user\u0027s instruction to describe how Patch helps Mr. Grumble make new friends and become more active in the community. The assistant provided a detailed and informative response that is in line with the characters and the storyline.",
"The assistant text is excellent. It perfectly followed the user\u0027s instruction to describe how Mr. Grumble and Patch spend a typical day together. It provided a detailed, engaging, and informative response that added depth to the characters and their storyline. The assistant maintained the consistency of the characters and their relationship."
],
"ultrafeedback_model": "gpt-4"
}
```
This subset can be loaded as:
```python
from datasets import load_dataset
ds = load_dataset("distilabel-internal-testing/airoboros-3.2-writing-ultrafeedback-mini", "push_to_argilla")
```
</details>
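As a quick illustration (using values copied from the configuration dumps above, not a definitive access pattern), the per-model ratings in a record can be paired with the models that produced each generation:

```python
# Hypothetical record mirroring the dumps above (values copied from
# the "keep_columns" configuration shown in this card).
record = {
    "generation_models": [
        "gpt-4-0125-preview",
        "mistral-medium",
        "claude-3-opus-20240229",
    ],
    "ratings": [5, 5, 5],
}

# Pair each generating model with the rating it received from the
# ultrafeedback model, and compute the mean rating for the record.
pairs = dict(zip(record["generation_models"], record["ratings"]))
mean_rating = sum(record["ratings"]) / len(record["ratings"])
print(pairs["mistral-medium"])  # 5
print(mean_rating)              # 5.0
```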
|
mickume/alt_manga | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 187922951
num_examples: 1022886
download_size: 116412680
dataset_size: 187922951
---
# Dataset Card for "alt_manga"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_mnli_comparative_than | ---
dataset_info:
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev_matched
num_bytes: 20489
num_examples: 90
- name: dev_mismatched
num_bytes: 23738
num_examples: 93
- name: test_matched
num_bytes: 23244
num_examples: 96
- name: test_mismatched
num_bytes: 32356
num_examples: 125
- name: train
num_bytes: 862863
num_examples: 3645
download_size: 561095
dataset_size: 962690
---
# Dataset Card for "MULTI_VALUE_mnli_comparative_than"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
bhjhk/minus887 | ---
license: bigcode-openrail-m
---
|
agie-ai/lmsys-chatbot_arena_conversations | ---
dataset_info:
features:
- name: question_id
dtype: string
- name: model_a
dtype: string
- name: model_b
dtype: string
- name: winner
dtype: string
- name: judge
dtype: string
- name: conversation_a
list:
- name: content
dtype: string
- name: role
dtype: string
- name: conversation_b
list:
- name: content
dtype: string
- name: role
dtype: string
- name: turn
dtype: int64
- name: anony
dtype: bool
- name: language
dtype: string
- name: tstamp
dtype: float64
- name: openai_moderation
struct:
- name: categories
struct:
- name: harassment
dtype: bool
- name: harassment/threatening
dtype: bool
- name: hate
dtype: bool
- name: hate/threatening
dtype: bool
- name: self-harm
dtype: bool
- name: self-harm/instructions
dtype: bool
- name: self-harm/intent
dtype: bool
- name: sexual
dtype: bool
- name: sexual/minors
dtype: bool
- name: violence
dtype: bool
- name: violence/graphic
dtype: bool
- name: category_scores
struct:
- name: harassment
dtype: float64
- name: harassment/threatening
dtype: float64
- name: hate
dtype: float64
- name: hate/threatening
dtype: float64
- name: self-harm
dtype: float64
- name: self-harm/instructions
dtype: float64
- name: self-harm/intent
dtype: float64
- name: sexual
dtype: float64
- name: sexual/minors
dtype: float64
- name: violence
dtype: float64
- name: violence/graphic
dtype: float64
- name: flagged
dtype: bool
- name: toxic_chat_tag
struct:
- name: roberta-large
struct:
- name: flagged
dtype: bool
- name: probability
dtype: float64
- name: t5-large
struct:
- name: flagged
dtype: bool
- name: score
dtype: float64
splits:
- name: train
num_bytes: 81159839
num_examples: 33000
download_size: 41572997
dataset_size: 81159839
---
# Dataset Card for "lmsys-chatbot_arena_conversations"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
G8881/Tinoco | ---
license: openrail
---
|
Minglii/e5 | ---
dataset_info:
features:
- name: data
struct:
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: id
dtype: string
splits:
- name: train
num_bytes: 1797829
num_examples: 2600
download_size: 1040195
dataset_size: 1797829
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "e5"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-staging-eval-boolq-default-049b58-14205948 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- boolq
eval_info:
task: natural_language_inference
model: andi611/distilbert-base-uncased-qa-boolq
metrics: []
dataset_name: boolq
dataset_config: default
dataset_split: validation
col_mapping:
text1: question
text2: passage
target: answer
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Natural Language Inference
* Model: andi611/distilbert-base-uncased-qa-boolq
* Dataset: boolq
* Config: default
* Split: validation
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
satwant/ExpertMedQA | ---
license: cc-by-nc-4.0
---
This dataset provides the complete ExpertMedQA dataset along with responses generated by BooksMed, highlighting the dataset's diversity and complexity and giving a comprehensive overview of its questions. ExpertMedQA is a novel benchmark of open-ended, expert-level clinical questions that require not only an understanding of the most recent clinical literature but also an analysis of the strength of the evidence presented. From current treatment guidelines to open-ended discussions grounded in current clinical research studies, the dataset covers a wide range of topics. |
witchling22/hybrid_data_fin | ---
dataset_info:
features:
- name: id
dtype: string
- name: values
sequence:
sequence: float64
- name: sparse_values
struct:
- name: indices
sequence: int64
- name: values
sequence: float64
- name: metadata
struct:
- name: context
dtype: string
splits:
- name: train
num_bytes: 143880286
num_examples: 15704
download_size: 107344746
dataset_size: 143880286
---
# Dataset Card for "hybrid_data_fin"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
lwface/sd-configs-1.5 | ---
license: mit
---
|
ju-bezdek/conll2003-SK-NER | ---
annotations_creators:
- machine-generated
- expert-generated
language_creators:
- found
language:
- sk
license:
- unknown
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
source_datasets:
- extended|conll2003
task_categories:
- other
task_ids:
- named-entity-recognition
- part-of-speech
pretty_name: conll-2003-sk-ner
tags:
- structure-prediction
---
# Dataset Card for conll2003-SK-NER
## Table of Contents
- [Dataset Card for conll2003-SK-NER](#dataset-card-for-conll2003-sk-ner)
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Annotation process](#annotation-process)
## Dataset Description
This is a translated version of the original CONLL2003 dataset (translated from English to Slovak via Google Translate). Annotation was done mostly automatically with word-matching scripts; records where some tags could not be matched were annotated manually (about 10%). Unlike the original CONLL2003 dataset, this one contains only NER tags.
- **Point of Contact:** [@ju-bezdek](https://github.com/ju-bezdek)
### Supported Tasks and Leaderboards
NER
labels:
- 0: O
- 1: B-PER
- 2: I-PER
- 3: B-ORG
- 4: I-ORG
- 5: B-LOC
- 6: I-LOC
- 7: B-MISC
- 8: I-MISC
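For convenience, the label list above can be turned into id/label lookup tables for decoding model predictions. A minimal sketch (label ids and tags are taken from the list above; the helper name is illustrative):

```python
# IOB2 label set as listed above; list position equals the label id.
LABELS = ["O", "B-PER", "I-PER", "B-ORG", "I-ORG",
          "B-LOC", "I-LOC", "B-MISC", "I-MISC"]

id2label = dict(enumerate(LABELS))
label2id = {label: i for i, label in enumerate(LABELS)}

def decode(tag_ids):
    """Convert a sequence of integer tag ids to IOB2 tag strings."""
    return [id2label[i] for i in tag_ids]
```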
### Languages
sk
## Dataset Structure
### Data Splits
train, test, val
## Dataset Creation
### Source Data
https://huggingface.co/datasets/conll2003
### Annotations
#### Annotation process
- Machine Translation
- Machine pairing tags with reverse translation, and hardcoded rules (including phrase regex matching etc.)
- Manual annotation of records that couldn't be automatically matched
|
Hypersniper/Steve_Jobs_Interviews | ---
license: apache-2.0
task_categories:
- text-generation
language:
- en
tags:
- steve jobs
- steve
- interviews
pretty_name: Steve Jobs
size_categories:
- n<1K
---
# Steve Jobs Interviews Database
[Support this project on Ko-fi](https://ko-fi.com/hypersniper)
## Project Overview
This project contains multiple interviews of Steve Jobs during his time before and after Apple.
### Goal
The primary goal of this dataset was to fine-tune a language model to reproduce Steve Jobs' views and thoughts.
## Performance
The performance obtained from this small dataset is noteworthy. Because the dataset consists of interview question-and-answer pairs, the model's replies tend to follow that pattern as well.
- **Model:** [Hypersniper/Steve_Jobs_Mistral_7B](https://huggingface.co/Hypersniper/Steve_Jobs_Mistral_7B) (fine-tuned Mistral 7B)
- **Fine-Tuning:** 14 epochs, LoRA rank 128, loss 0.2149
### Sample Questions and Outputs
#### Question 1

#### Question 2
 |
dmrau/cqadupstack-english | ---
configs:
- config_name: default
data_files:
- split: queries
path: data/queries-*
- split: corpus
path: data/corpus-*
dataset_info:
features:
- name: _id
dtype: string
- name: text
dtype: string
- name: title
dtype: string
splits:
- name: queries
num_bytes: 103588
num_examples: 1570
- name: corpus
num_bytes: 18199570
num_examples: 40221
download_size: 11382247
dataset_size: 18303158
---
# Dataset Card for "cqadupstack-english"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
BangumiBase/chihayafuru | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of Chihayafuru
This is the image base of the bangumi Chihayafuru. We detected 58 characters and 8676 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% clean; they may contain noise.** If you intend to manually train models using this dataset, we recommend preprocessing the downloaded data to eliminate potential noisy samples (roughly 1% of images).
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 510 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 97 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 1030 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 509 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 459 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 172 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 183 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 84 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 287 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 60 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 26 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 18 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 177 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 182 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 71 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 26 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 32 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 27 | [Download](17/dataset.zip) |  |  |  |  |  |  |  |  |
| 18 | 106 | [Download](18/dataset.zip) |  |  |  |  |  |  |  |  |
| 19 | 423 | [Download](19/dataset.zip) |  |  |  |  |  |  |  |  |
| 20 | 74 | [Download](20/dataset.zip) |  |  |  |  |  |  |  |  |
| 21 | 59 | [Download](21/dataset.zip) |  |  |  |  |  |  |  |  |
| 22 | 81 | [Download](22/dataset.zip) |  |  |  |  |  |  |  |  |
| 23 | 92 | [Download](23/dataset.zip) |  |  |  |  |  |  |  |  |
| 24 | 36 | [Download](24/dataset.zip) |  |  |  |  |  |  |  |  |
| 25 | 149 | [Download](25/dataset.zip) |  |  |  |  |  |  |  |  |
| 26 | 1169 | [Download](26/dataset.zip) |  |  |  |  |  |  |  |  |
| 27 | 279 | [Download](27/dataset.zip) |  |  |  |  |  |  |  |  |
| 28 | 56 | [Download](28/dataset.zip) |  |  |  |  |  |  |  |  |
| 29 | 854 | [Download](29/dataset.zip) |  |  |  |  |  |  |  |  |
| 30 | 47 | [Download](30/dataset.zip) |  |  |  |  |  |  |  |  |
| 31 | 99 | [Download](31/dataset.zip) |  |  |  |  |  |  |  |  |
| 32 | 72 | [Download](32/dataset.zip) |  |  |  |  |  |  |  |  |
| 33 | 51 | [Download](33/dataset.zip) |  |  |  |  |  |  |  |  |
| 34 | 135 | [Download](34/dataset.zip) |  |  |  |  |  |  |  |  |
| 35 | 37 | [Download](35/dataset.zip) |  |  |  |  |  |  |  |  |
| 36 | 74 | [Download](36/dataset.zip) |  |  |  |  |  |  |  |  |
| 37 | 34 | [Download](37/dataset.zip) |  |  |  |  |  |  |  |  |
| 38 | 37 | [Download](38/dataset.zip) |  |  |  |  |  |  |  |  |
| 39 | 85 | [Download](39/dataset.zip) |  |  |  |  |  |  |  |  |
| 40 | 21 | [Download](40/dataset.zip) |  |  |  |  |  |  |  |  |
| 41 | 33 | [Download](41/dataset.zip) |  |  |  |  |  |  |  |  |
| 42 | 76 | [Download](42/dataset.zip) |  |  |  |  |  |  |  |  |
| 43 | 25 | [Download](43/dataset.zip) |  |  |  |  |  |  |  |  |
| 44 | 45 | [Download](44/dataset.zip) |  |  |  |  |  |  |  |  |
| 45 | 69 | [Download](45/dataset.zip) |  |  |  |  |  |  |  |  |
| 46 | 10 | [Download](46/dataset.zip) |  |  |  |  |  |  |  |  |
| 47 | 36 | [Download](47/dataset.zip) |  |  |  |  |  |  |  |  |
| 48 | 12 | [Download](48/dataset.zip) |  |  |  |  |  |  |  |  |
| 49 | 35 | [Download](49/dataset.zip) |  |  |  |  |  |  |  |  |
| 50 | 15 | [Download](50/dataset.zip) |  |  |  |  |  |  |  |  |
| 51 | 78 | [Download](51/dataset.zip) |  |  |  |  |  |  |  |  |
| 52 | 20 | [Download](52/dataset.zip) |  |  |  |  |  |  |  |  |
| 53 | 14 | [Download](53/dataset.zip) |  |  |  |  |  |  |  |  |
| 54 | 18 | [Download](54/dataset.zip) |  |  |  |  |  |  |  |  |
| 55 | 20 | [Download](55/dataset.zip) |  |  |  |  |  |  |  |  |
| 56 | 19 | [Download](56/dataset.zip) |  |  |  |  |  |  |  |  |
| noise | 131 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
|
FreedomIntelligence/alpaca-gpt4-arabic | ---
license: apache-2.0
---
The dataset is used in the research related to [MultilingualSIFT](https://github.com/FreedomIntelligence/MultilingualSIFT). |
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/71e9d947 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 186
num_examples: 10
download_size: 1337
dataset_size: 186
---
# Dataset Card for "71e9d947"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
SamiA1234/datasetEdited.txt | ---
license: wtfpl
---
|
hugfaceguy0001/TangshiDalle3Images | ---
dataset_info:
features:
- name: image
dtype: image
- name: poem_id
dtype: string
- name: prompt
dtype: string
- name: revised_prompt
dtype: string
splits:
- name: train
num_bytes: 3817427239
num_examples: 693
download_size: 3485749230
dataset_size: 3817427239
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: openrail
task_categories:
- text-to-image
language:
- en
- zh
tags:
- art
- culture
- poem
- dalle3
- diffusion
- Chinese
pretty_name: 唐诗配图数据集
size_categories:
- n<1K
---
# Tang Poetry Illustration Dataset (唐诗配图数据集)
This dataset covers 231 poems from the *Three Hundred Tang Poems* (《唐诗三百首》): all 80 five-character regulated verses, 54 seven-character regulated verses, 37 five-character quatrains, and 60 seven-character quatrains.
For each poem, three wide-format images were generated with DALL·E 3, one for each of the following prompt formats, giving 693 images in total:
1. 请根据{作者}的唐诗作画, 画面中不要有文字: {唐诗正文} ("Please paint a picture based on the Tang poem by {author}, with no text in the image: {poem text}")
2. {唐诗正文} (the poem text alone)
3. 请根据{作者}的唐诗《{标题}》作画, 画面中不要有文字: {唐诗正文} (as in format 1, but also naming the poem's title {标题})
## Field Descriptions
`image`: the image file. All images in this dataset are wide-format at 1792x1024 resolution and were generated at `hd` quality.
`poem_id`: the poem's identifier, in the format {form}_{index}. {form} is one of wulv (five-character regulated verse), qilv (seven-character regulated verse), wujue (five-character quatrain), or qijue (seven-character quatrain); {index} is the poem's position within that form, following the order of the *Three Hundred Tang Poems*.
`prompt`: the original prompt sent to DALL·E 3.
`revised_prompt`: the prompt automatically expanded by DALL·E 3 from the original prompt, i.e., the `revised_prompt` field of the DALL·E 3 API response.
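As an illustration, a `poem_id` such as `wulv_12` can be split into its verse form and index. A minimal sketch (the id format is from the card; the helper function and English form glosses are illustrative):

```python
# Split a poem_id like "wulv_12" into its verse form and
# 1-based index within that form (order as in the anthology).
FORMS = {
    "wulv": "five-character regulated verse",
    "qilv": "seven-character regulated verse",
    "wujue": "five-character quatrain",
    "qijue": "seven-character quatrain",
}

def parse_poem_id(poem_id: str) -> tuple[str, int]:
    form, index = poem_id.rsplit("_", 1)
    if form not in FORMS:
        raise ValueError(f"unknown verse form: {form}")
    return form, int(index)
```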
## Uses
This dataset can provide illustrations for the *Three Hundred Tang Poems*, be used to fine-tune text-to-image models for poem illustration, or be used to fine-tune language models to generate visual descriptions from verse.
## Limitations
A considerable portion of the images contain text, which is generally incorrect: some of the English is spelled correctly, but the Chinese characters are mostly garbled. Some images do not necessarily match the theme of their poem; if the dataset is used for tasks requiring high accuracy (for example, publishing an illustrated edition of the *Three Hundred Tang Poems*), the match should be checked rigorously.
Corrections to the dataset and contributions of additional data are welcome!
## Contact author
QQ: 583753622 |
CultriX/dpo-mix-ambrosia-cleaned | ---
license: apache-2.0
---
|
MingLiiii/Wiz70_Analysis_llama2_7b | ---
dataset_info:
features:
- name: data
struct:
- name: loss
sequence: float64
- name: ppl
sequence: float64
splits:
- name: origin
num_bytes: 5057436
num_examples: 70000
- name: reflect_instruction
num_bytes: 5040000
num_examples: 70000
- name: reflect_response
num_bytes: 5040000
num_examples: 70000
- name: reflect_both
num_bytes: 5040000
num_examples: 70000
download_size: 16867497
dataset_size: 20177436
---
# Dataset Card for "Wiz70_Analysis_llama2_7b"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mrm8488/en_es_results_bad | ---
dataset_info:
features:
- name: source
dtype: string
- name: target
dtype: string
splits:
- name: train
num_bytes: 1420
num_examples: 20
download_size: 2784
dataset_size: 1420
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
aditijha/instruct_v1_10k_and_lima | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: response
dtype: string
- name: source
dtype: string
splits:
- name: train
num_bytes: 10473658
num_examples: 11000
download_size: 5587292
dataset_size: 10473658
---
# Dataset Card for "instruct_v1_10k_and_lima"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Kazel/capstone | ---
license: mit
---
|
chenrm/illusion-cards | ---
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 41920616810.06
num_examples: 73190
download_size: 37899199783
dataset_size: 41920616810.06
---
# Dataset Card for "illusion-cards"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/kaku_seiga_touhou | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of kaku_seiga/青娥娘々/霍青娥/곽청아 (Touhou)
This is the dataset of kaku_seiga/青娥娘々/霍青娥/곽청아 (Touhou), containing 500 images and their tags.
The core tags of this character are `blue_hair, hair_rings, hair_ornament, blue_eyes, short_hair, breasts`, which are pruned in this dataset.
Images were crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:------------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 622.37 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kaku_seiga_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 412.62 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kaku_seiga_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1100 | 780.84 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kaku_seiga_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 571.61 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kaku_seiga_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1100 | 1001.82 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kaku_seiga_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/kaku_seiga_touhou',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 17 |  |  |  |  |  | 1girl, dress, flower, hair_stick, shawl, smile, solo, vest |
| 1 | 10 |  |  |  |  |  | 1girl, blush, dress, flower, hair_stick, shawl, smile, solo, vest, medium_breasts |
| 2 | 14 |  |  |  |  |  | 1girl, dress, flower, hair_stick, shawl, smile, solo, vest, open_mouth, danmaku, energy_ball |
| 3 | 10 |  |  |  |  |  | 1girl, dress, flower, hair_stick, shawl, smile, solo, vest, medium_breasts, butterfly, cleavage |
| 4 | 7 |  |  |  |  |  | 1girl, blue_dress, hair_stick, open_vest, shawl, solo, flower, puffy_short_sleeves, smile, looking_at_viewer, drill_hair |
| 5 | 6 |  |  |  |  |  | 1girl, bangs, black_footwear, blue_dress, closed_mouth, full_body, hair_stick, simple_background, solo, white_vest, flower, hagoromo, open_vest, puffy_short_sleeves, white_socks, smile, white_background, frills, looking_at_viewer, shoes |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | dress | flower | hair_stick | shawl | smile | solo | vest | blush | medium_breasts | open_mouth | danmaku | energy_ball | butterfly | cleavage | blue_dress | open_vest | puffy_short_sleeves | looking_at_viewer | drill_hair | bangs | black_footwear | closed_mouth | full_body | simple_background | white_vest | hagoromo | white_socks | white_background | frills | shoes |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:---------|:-------------|:--------|:--------|:-------|:-------|:--------|:-----------------|:-------------|:----------|:--------------|:------------|:-----------|:-------------|:------------|:----------------------|:--------------------|:-------------|:--------|:-----------------|:---------------|:------------|:--------------------|:-------------|:-----------|:--------------|:-------------------|:---------|:--------|
| 0 | 17 |  |  |  |  |  | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | |
| 2 | 14 |  |  |  |  |  | X | X | X | X | X | X | X | X | | | X | X | X | | | | | | | | | | | | | | | | | | |
| 3 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | | X | | | | X | X | | | | | | | | | | | | | | | | |
| 4 | 7 |  |  |  |  |  | X | | X | X | X | X | X | | | | | | | | | X | X | X | X | X | | | | | | | | | | | |
| 5 | 6 |  |  |  |  |  | X | | X | X | | X | X | | | | | | | | | X | X | X | X | | X | X | X | X | X | X | X | X | X | X | X |
|
mfigurski80/processed_narrative_relationship_dataset | ---
dataset_info:
features:
- name: subject
dtype: string
- name: object
dtype: string
- name: dialogue
dtype: string
- name: pair_examples
dtype: int64
splits:
- name: test
num_bytes: 3410751.179531327
num_examples: 15798
- name: train
num_bytes: 13642788.820468673
num_examples: 63191
download_size: 9671733
dataset_size: 17053540.0
---
# Dataset Card for "processed_narrative_relationship_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
irds/mmarco_v2_pt | ---
pretty_name: '`mmarco/v2/pt`'
viewer: false
source_datasets: []
task_categories:
- text-retrieval
---
# Dataset Card for `mmarco/v2/pt`
The `mmarco/v2/pt` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/mmarco#mmarco/v2/pt).
# Data
This dataset provides:
- `docs` (documents, i.e., the corpus); count=8,841,823
This dataset is used by: [`mmarco_v2_pt_dev`](https://huggingface.co/datasets/irds/mmarco_v2_pt_dev), [`mmarco_v2_pt_train`](https://huggingface.co/datasets/irds/mmarco_v2_pt_train)
## Usage
```python
from datasets import load_dataset
docs = load_dataset('irds/mmarco_v2_pt', 'docs')
for record in docs:
record # {'doc_id': ..., 'text': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
## Citation Information
```
@article{Bonifacio2021MMarco,
title={{mMARCO}: A Multilingual Version of {MS MARCO} Passage Ranking Dataset},
author={Luiz Henrique Bonifacio and Israel Campiotti and Roberto Lotufo and Rodrigo Nogueira},
year={2021},
journal={arXiv:2108.13897}
}
```
|
huggingartists/mf-doom | ---
language:
- en
tags:
- huggingartists
- lyrics
---
# Dataset Card for "huggingartists/mf-doom"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [How to use](#how-to-use)
- [Dataset Structure](#dataset-structure)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [About](#about)
## Dataset Description
- **Homepage:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Repository:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of the generated dataset:** 1.820143 MB
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://images.genius.com/263743633b6e58854e753b25dca6beab.430x430x1.jpg')">
</div>
</div>
<a href="https://huggingface.co/huggingartists/mf-doom">
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
</a>
<div style="text-align: center; font-size: 16px; font-weight: 800">MF DOOM</div>
<a href="https://genius.com/artists/mf-doom">
<div style="text-align: center; font-size: 14px;">@mf-doom</div>
</a>
</div>
### Dataset Summary
The Lyrics dataset parsed from Genius. This dataset is designed to generate lyrics with HuggingArtists.
Model is available [here](https://huggingface.co/huggingartists/mf-doom).
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
en
## How to use
How to load this dataset directly with the datasets library:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/mf-doom")
```
## Dataset Structure
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"text": "Look, I was gonna go easy on you\nNot to hurt your feelings\nBut I'm only going to get this one chance\nSomething's wrong, I can feel it..."
}
```
### Data Fields
The data fields are the same among all splits.
- `text`: a `string` feature.
### Data Splits
| train |validation|test|
|------:|---------:|---:|
|945| -| -|
The 'train' split can easily be divided into 'train', 'validation', and 'test' with a few lines of code:
```python
from datasets import load_dataset, Dataset, DatasetDict
import numpy as np
datasets = load_dataset("huggingartists/mf-doom")
train_percentage = 0.9
validation_percentage = 0.07
test_percentage = 0.03
train, validation, test = np.split(datasets['train']['text'], [int(len(datasets['train']['text'])*train_percentage), int(len(datasets['train']['text'])*(train_percentage + validation_percentage))])
datasets = DatasetDict(
{
'train': Dataset.from_dict({'text': list(train)}),
'validation': Dataset.from_dict({'text': list(validation)}),
'test': Dataset.from_dict({'text': list(test)})
}
)
```
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@InProceedings{huggingartists,
author={Aleksey Korshuk}
year=2021
}
```
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
EgilKarlsen/Spirit_RoBERTa_Finetuned | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: '0'
dtype: float32
- name: '1'
dtype: float32
- name: '2'
dtype: float32
- name: '3'
dtype: float32
- name: '4'
dtype: float32
- name: '5'
dtype: float32
- name: '6'
dtype: float32
- name: '7'
dtype: float32
- name: '8'
dtype: float32
- name: '9'
dtype: float32
- name: '10'
dtype: float32
- name: '11'
dtype: float32
- name: '12'
dtype: float32
- name: '13'
dtype: float32
- name: '14'
dtype: float32
- name: '15'
dtype: float32
- name: '16'
dtype: float32
- name: '17'
dtype: float32
- name: '18'
dtype: float32
- name: '19'
dtype: float32
- name: '20'
dtype: float32
- name: '21'
dtype: float32
- name: '22'
dtype: float32
- name: '23'
dtype: float32
- name: '24'
dtype: float32
- name: '25'
dtype: float32
- name: '26'
dtype: float32
- name: '27'
dtype: float32
- name: '28'
dtype: float32
- name: '29'
dtype: float32
- name: '30'
dtype: float32
- name: '31'
dtype: float32
- name: '32'
dtype: float32
- name: '33'
dtype: float32
- name: '34'
dtype: float32
- name: '35'
dtype: float32
- name: '36'
dtype: float32
- name: '37'
dtype: float32
- name: '38'
dtype: float32
- name: '39'
dtype: float32
- name: '40'
dtype: float32
- name: '41'
dtype: float32
- name: '42'
dtype: float32
- name: '43'
dtype: float32
- name: '44'
dtype: float32
- name: '45'
dtype: float32
- name: '46'
dtype: float32
- name: '47'
dtype: float32
- name: '48'
dtype: float32
- name: '49'
dtype: float32
- name: '50'
dtype: float32
- name: '51'
dtype: float32
- name: '52'
dtype: float32
- name: '53'
dtype: float32
- name: '54'
dtype: float32
- name: '55'
dtype: float32
- name: '56'
dtype: float32
- name: '57'
dtype: float32
- name: '58'
dtype: float32
- name: '59'
dtype: float32
- name: '60'
dtype: float32
- name: '61'
dtype: float32
- name: '62'
dtype: float32
- name: '63'
dtype: float32
- name: '64'
dtype: float32
- name: '65'
dtype: float32
- name: '66'
dtype: float32
- name: '67'
dtype: float32
- name: '68'
dtype: float32
- name: '69'
dtype: float32
- name: '70'
dtype: float32
- name: '71'
dtype: float32
- name: '72'
dtype: float32
- name: '73'
dtype: float32
- name: '74'
dtype: float32
- name: '75'
dtype: float32
- name: '76'
dtype: float32
- name: '77'
dtype: float32
- name: '78'
dtype: float32
- name: '79'
dtype: float32
- name: '80'
dtype: float32
- name: '81'
dtype: float32
- name: '82'
dtype: float32
- name: '83'
dtype: float32
- name: '84'
dtype: float32
- name: '85'
dtype: float32
- name: '86'
dtype: float32
- name: '87'
dtype: float32
- name: '88'
dtype: float32
- name: '89'
dtype: float32
- name: '90'
dtype: float32
- name: '91'
dtype: float32
- name: '92'
dtype: float32
- name: '93'
dtype: float32
- name: '94'
dtype: float32
- name: '95'
dtype: float32
- name: '96'
dtype: float32
- name: '97'
dtype: float32
- name: '98'
dtype: float32
- name: '99'
dtype: float32
- name: '100'
dtype: float32
- name: '101'
dtype: float32
- name: '102'
dtype: float32
- name: '103'
dtype: float32
- name: '104'
dtype: float32
- name: '105'
dtype: float32
- name: '106'
dtype: float32
- name: '107'
dtype: float32
- name: '108'
dtype: float32
- name: '109'
dtype: float32
- name: '110'
dtype: float32
- name: '111'
dtype: float32
- name: '112'
dtype: float32
- name: '113'
dtype: float32
- name: '114'
dtype: float32
- name: '115'
dtype: float32
- name: '116'
dtype: float32
- name: '117'
dtype: float32
- name: '118'
dtype: float32
- name: '119'
dtype: float32
- name: '120'
dtype: float32
- name: '121'
dtype: float32
- name: '122'
dtype: float32
- name: '123'
dtype: float32
- name: '124'
dtype: float32
- name: '125'
dtype: float32
- name: '126'
dtype: float32
- name: '127'
dtype: float32
- name: '128'
dtype: float32
- name: '129'
dtype: float32
- name: '130'
dtype: float32
- name: '131'
dtype: float32
- name: '132'
dtype: float32
- name: '133'
dtype: float32
- name: '134'
dtype: float32
- name: '135'
dtype: float32
- name: '136'
dtype: float32
- name: '137'
dtype: float32
- name: '138'
dtype: float32
- name: '139'
dtype: float32
- name: '140'
dtype: float32
- name: '141'
dtype: float32
- name: '142'
dtype: float32
- name: '143'
dtype: float32
- name: '144'
dtype: float32
- name: '145'
dtype: float32
- name: '146'
dtype: float32
- name: '147'
dtype: float32
- name: '148'
dtype: float32
- name: '149'
dtype: float32
- name: '150'
dtype: float32
- name: '151'
dtype: float32
- name: '152'
dtype: float32
- name: '153'
dtype: float32
- name: '154'
dtype: float32
- name: '155'
dtype: float32
- name: '156'
dtype: float32
- name: '157'
dtype: float32
- name: '158'
dtype: float32
- name: '159'
dtype: float32
- name: '160'
dtype: float32
- name: '161'
dtype: float32
- name: '162'
dtype: float32
- name: '163'
dtype: float32
- name: '164'
dtype: float32
- name: '165'
dtype: float32
- name: '166'
dtype: float32
- name: '167'
dtype: float32
- name: '168'
dtype: float32
- name: '169'
dtype: float32
- name: '170'
dtype: float32
- name: '171'
dtype: float32
- name: '172'
dtype: float32
- name: '173'
dtype: float32
- name: '174'
dtype: float32
- name: '175'
dtype: float32
- name: '176'
dtype: float32
- name: '177'
dtype: float32
- name: '178'
dtype: float32
- name: '179'
dtype: float32
- name: '180'
dtype: float32
- name: '181'
dtype: float32
- name: '182'
dtype: float32
- name: '183'
dtype: float32
- name: '184'
dtype: float32
- name: '185'
dtype: float32
- name: '186'
dtype: float32
- name: '187'
dtype: float32
- name: '188'
dtype: float32
- name: '189'
dtype: float32
- name: '190'
dtype: float32
- name: '191'
dtype: float32
- name: '192'
dtype: float32
- name: '193'
dtype: float32
- name: '194'
dtype: float32
- name: '195'
dtype: float32
- name: '196'
dtype: float32
- name: '197'
dtype: float32
- name: '198'
dtype: float32
- name: '199'
dtype: float32
- name: '200'
dtype: float32
- name: '201'
dtype: float32
- name: '202'
dtype: float32
- name: '203'
dtype: float32
- name: '204'
dtype: float32
- name: '205'
dtype: float32
- name: '206'
dtype: float32
- name: '207'
dtype: float32
- name: '208'
dtype: float32
- name: '209'
dtype: float32
- name: '210'
dtype: float32
- name: '211'
dtype: float32
- name: '212'
dtype: float32
- name: '213'
dtype: float32
- name: '214'
dtype: float32
- name: '215'
dtype: float32
- name: '216'
dtype: float32
- name: '217'
dtype: float32
- name: '218'
dtype: float32
- name: '219'
dtype: float32
- name: '220'
dtype: float32
- name: '221'
dtype: float32
- name: '222'
dtype: float32
- name: '223'
dtype: float32
- name: '224'
dtype: float32
- name: '225'
dtype: float32
- name: '226'
dtype: float32
- name: '227'
dtype: float32
- name: '228'
dtype: float32
- name: '229'
dtype: float32
- name: '230'
dtype: float32
- name: '231'
dtype: float32
- name: '232'
dtype: float32
- name: '233'
dtype: float32
- name: '234'
dtype: float32
- name: '235'
dtype: float32
- name: '236'
dtype: float32
- name: '237'
dtype: float32
- name: '238'
dtype: float32
- name: '239'
dtype: float32
- name: '240'
dtype: float32
- name: '241'
dtype: float32
- name: '242'
dtype: float32
- name: '243'
dtype: float32
- name: '244'
dtype: float32
- name: '245'
dtype: float32
- name: '246'
dtype: float32
- name: '247'
dtype: float32
- name: '248'
dtype: float32
- name: '249'
dtype: float32
- name: '250'
dtype: float32
- name: '251'
dtype: float32
- name: '252'
dtype: float32
- name: '253'
dtype: float32
- name: '254'
dtype: float32
- name: '255'
dtype: float32
- name: '256'
dtype: float32
- name: '257'
dtype: float32
- name: '258'
dtype: float32
- name: '259'
dtype: float32
- name: '260'
dtype: float32
- name: '261'
dtype: float32
- name: '262'
dtype: float32
- name: '263'
dtype: float32
- name: '264'
dtype: float32
- name: '265'
dtype: float32
- name: '266'
dtype: float32
- name: '267'
dtype: float32
- name: '268'
dtype: float32
- name: '269'
dtype: float32
- name: '270'
dtype: float32
- name: '271'
dtype: float32
- name: '272'
dtype: float32
- name: '273'
dtype: float32
- name: '274'
dtype: float32
- name: '275'
dtype: float32
- name: '276'
dtype: float32
- name: '277'
dtype: float32
- name: '278'
dtype: float32
- name: '279'
dtype: float32
- name: '280'
dtype: float32
- name: '281'
dtype: float32
- name: '282'
dtype: float32
- name: '283'
dtype: float32
- name: '284'
dtype: float32
- name: '285'
dtype: float32
- name: '286'
dtype: float32
- name: '287'
dtype: float32
- name: '288'
dtype: float32
- name: '289'
dtype: float32
- name: '290'
dtype: float32
- name: '291'
dtype: float32
- name: '292'
dtype: float32
- name: '293'
dtype: float32
- name: '294'
dtype: float32
- name: '295'
dtype: float32
- name: '296'
dtype: float32
- name: '297'
dtype: float32
- name: '298'
dtype: float32
- name: '299'
dtype: float32
- name: '300'
dtype: float32
- name: '301'
dtype: float32
- name: '302'
dtype: float32
- name: '303'
dtype: float32
- name: '304'
dtype: float32
- name: '305'
dtype: float32
- name: '306'
dtype: float32
- name: '307'
dtype: float32
- name: '308'
dtype: float32
- name: '309'
dtype: float32
- name: '310'
dtype: float32
- name: '311'
dtype: float32
- name: '312'
dtype: float32
- name: '313'
dtype: float32
- name: '314'
dtype: float32
- name: '315'
dtype: float32
- name: '316'
dtype: float32
- name: '317'
dtype: float32
- name: '318'
dtype: float32
- name: '319'
dtype: float32
- name: '320'
dtype: float32
- name: '321'
dtype: float32
- name: '322'
dtype: float32
- name: '323'
dtype: float32
- name: '324'
dtype: float32
- name: '325'
dtype: float32
- name: '326'
dtype: float32
- name: '327'
dtype: float32
- name: '328'
dtype: float32
- name: '329'
dtype: float32
- name: '330'
dtype: float32
- name: '331'
dtype: float32
- name: '332'
dtype: float32
- name: '333'
dtype: float32
- name: '334'
dtype: float32
- name: '335'
dtype: float32
- name: '336'
dtype: float32
- name: '337'
dtype: float32
- name: '338'
dtype: float32
- name: '339'
dtype: float32
- name: '340'
dtype: float32
- name: '341'
dtype: float32
- name: '342'
dtype: float32
- name: '343'
dtype: float32
- name: '344'
dtype: float32
- name: '345'
dtype: float32
- name: '346'
dtype: float32
- name: '347'
dtype: float32
- name: '348'
dtype: float32
- name: '349'
dtype: float32
- name: '350'
dtype: float32
- name: '351'
dtype: float32
- name: '352'
dtype: float32
- name: '353'
dtype: float32
- name: '354'
dtype: float32
- name: '355'
dtype: float32
- name: '356'
dtype: float32
- name: '357'
dtype: float32
- name: '358'
dtype: float32
- name: '359'
dtype: float32
- name: '360'
dtype: float32
- name: '361'
dtype: float32
- name: '362'
dtype: float32
- name: '363'
dtype: float32
- name: '364'
dtype: float32
- name: '365'
dtype: float32
- name: '366'
dtype: float32
- name: '367'
dtype: float32
- name: '368'
dtype: float32
- name: '369'
dtype: float32
- name: '370'
dtype: float32
- name: '371'
dtype: float32
- name: '372'
dtype: float32
- name: '373'
dtype: float32
- name: '374'
dtype: float32
- name: '375'
dtype: float32
- name: '376'
dtype: float32
- name: '377'
dtype: float32
- name: '378'
dtype: float32
- name: '379'
dtype: float32
- name: '380'
dtype: float32
- name: '381'
dtype: float32
- name: '382'
dtype: float32
- name: '383'
dtype: float32
- name: '384'
dtype: float32
- name: '385'
dtype: float32
- name: '386'
dtype: float32
- name: '387'
dtype: float32
- name: '388'
dtype: float32
- name: '389'
dtype: float32
- name: '390'
dtype: float32
- name: '391'
dtype: float32
- name: '392'
dtype: float32
- name: '393'
dtype: float32
- name: '394'
dtype: float32
- name: '395'
dtype: float32
- name: '396'
dtype: float32
- name: '397'
dtype: float32
- name: '398'
dtype: float32
- name: '399'
dtype: float32
- name: '400'
dtype: float32
- name: '401'
dtype: float32
- name: '402'
dtype: float32
- name: '403'
dtype: float32
- name: '404'
dtype: float32
- name: '405'
dtype: float32
- name: '406'
dtype: float32
- name: '407'
dtype: float32
- name: '408'
dtype: float32
- name: '409'
dtype: float32
- name: '410'
dtype: float32
- name: '411'
dtype: float32
- name: '412'
dtype: float32
- name: '413'
dtype: float32
- name: '414'
dtype: float32
- name: '415'
dtype: float32
- name: '416'
dtype: float32
- name: '417'
dtype: float32
- name: '418'
dtype: float32
- name: '419'
dtype: float32
- name: '420'
dtype: float32
- name: '421'
dtype: float32
- name: '422'
dtype: float32
- name: '423'
dtype: float32
- name: '424'
dtype: float32
- name: '425'
dtype: float32
- name: '426'
dtype: float32
- name: '427'
dtype: float32
- name: '428'
dtype: float32
- name: '429'
dtype: float32
- name: '430'
dtype: float32
- name: '431'
dtype: float32
- name: '432'
dtype: float32
- name: '433'
dtype: float32
- name: '434'
dtype: float32
- name: '435'
dtype: float32
- name: '436'
dtype: float32
- name: '437'
dtype: float32
- name: '438'
dtype: float32
- name: '439'
dtype: float32
- name: '440'
dtype: float32
- name: '441'
dtype: float32
- name: '442'
dtype: float32
- name: '443'
dtype: float32
- name: '444'
dtype: float32
- name: '445'
dtype: float32
- name: '446'
dtype: float32
- name: '447'
dtype: float32
- name: '448'
dtype: float32
- name: '449'
dtype: float32
- name: '450'
dtype: float32
- name: '451'
dtype: float32
- name: '452'
dtype: float32
- name: '453'
dtype: float32
- name: '454'
dtype: float32
- name: '455'
dtype: float32
- name: '456'
dtype: float32
- name: '457'
dtype: float32
- name: '458'
dtype: float32
- name: '459'
dtype: float32
- name: '460'
dtype: float32
- name: '461'
dtype: float32
- name: '462'
dtype: float32
- name: '463'
dtype: float32
- name: '464'
dtype: float32
- name: '465'
dtype: float32
- name: '466'
dtype: float32
- name: '467'
dtype: float32
- name: '468'
dtype: float32
- name: '469'
dtype: float32
- name: '470'
dtype: float32
- name: '471'
dtype: float32
- name: '472'
dtype: float32
- name: '473'
dtype: float32
- name: '474'
dtype: float32
- name: '475'
dtype: float32
- name: '476'
dtype: float32
- name: '477'
dtype: float32
- name: '478'
dtype: float32
- name: '479'
dtype: float32
- name: '480'
dtype: float32
- name: '481'
dtype: float32
- name: '482'
dtype: float32
- name: '483'
dtype: float32
- name: '484'
dtype: float32
- name: '485'
dtype: float32
- name: '486'
dtype: float32
- name: '487'
dtype: float32
- name: '488'
dtype: float32
- name: '489'
dtype: float32
- name: '490'
dtype: float32
- name: '491'
dtype: float32
- name: '492'
dtype: float32
- name: '493'
dtype: float32
- name: '494'
dtype: float32
- name: '495'
dtype: float32
- name: '496'
dtype: float32
- name: '497'
dtype: float32
- name: '498'
dtype: float32
- name: '499'
dtype: float32
- name: '500'
dtype: float32
- name: '501'
dtype: float32
- name: '502'
dtype: float32
- name: '503'
dtype: float32
- name: '504'
dtype: float32
- name: '505'
dtype: float32
- name: '506'
dtype: float32
- name: '507'
dtype: float32
- name: '508'
dtype: float32
- name: '509'
dtype: float32
- name: '510'
dtype: float32
- name: '511'
dtype: float32
- name: '512'
dtype: float32
- name: '513'
dtype: float32
- name: '514'
dtype: float32
- name: '515'
dtype: float32
- name: '516'
dtype: float32
- name: '517'
dtype: float32
- name: '518'
dtype: float32
- name: '519'
dtype: float32
- name: '520'
dtype: float32
- name: '521'
dtype: float32
- name: '522'
dtype: float32
- name: '523'
dtype: float32
- name: '524'
dtype: float32
- name: '525'
dtype: float32
- name: '526'
dtype: float32
- name: '527'
dtype: float32
- name: '528'
dtype: float32
- name: '529'
dtype: float32
- name: '530'
dtype: float32
- name: '531'
dtype: float32
- name: '532'
dtype: float32
- name: '533'
dtype: float32
- name: '534'
dtype: float32
- name: '535'
dtype: float32
- name: '536'
dtype: float32
- name: '537'
dtype: float32
- name: '538'
dtype: float32
- name: '539'
dtype: float32
- name: '540'
dtype: float32
- name: '541'
dtype: float32
- name: '542'
dtype: float32
- name: '543'
dtype: float32
- name: '544'
dtype: float32
- name: '545'
dtype: float32
- name: '546'
dtype: float32
- name: '547'
dtype: float32
- name: '548'
dtype: float32
- name: '549'
dtype: float32
- name: '550'
dtype: float32
- name: '551'
dtype: float32
- name: '552'
dtype: float32
- name: '553'
dtype: float32
- name: '554'
dtype: float32
- name: '555'
dtype: float32
- name: '556'
dtype: float32
- name: '557'
dtype: float32
- name: '558'
dtype: float32
- name: '559'
dtype: float32
- name: '560'
dtype: float32
- name: '561'
dtype: float32
- name: '562'
dtype: float32
- name: '563'
dtype: float32
- name: '564'
dtype: float32
- name: '565'
dtype: float32
- name: '566'
dtype: float32
- name: '567'
dtype: float32
- name: '568'
dtype: float32
- name: '569'
dtype: float32
- name: '570'
dtype: float32
- name: '571'
dtype: float32
- name: '572'
dtype: float32
- name: '573'
dtype: float32
- name: '574'
dtype: float32
- name: '575'
dtype: float32
- name: '576'
dtype: float32
- name: '577'
dtype: float32
- name: '578'
dtype: float32
- name: '579'
dtype: float32
- name: '580'
dtype: float32
- name: '581'
dtype: float32
- name: '582'
dtype: float32
- name: '583'
dtype: float32
- name: '584'
dtype: float32
- name: '585'
dtype: float32
- name: '586'
dtype: float32
- name: '587'
dtype: float32
- name: '588'
dtype: float32
- name: '589'
dtype: float32
- name: '590'
dtype: float32
- name: '591'
dtype: float32
- name: '592'
dtype: float32
- name: '593'
dtype: float32
- name: '594'
dtype: float32
- name: '595'
dtype: float32
- name: '596'
dtype: float32
- name: '597'
dtype: float32
- name: '598'
dtype: float32
- name: '599'
dtype: float32
- name: '600'
dtype: float32
- name: '601'
dtype: float32
- name: '602'
dtype: float32
- name: '603'
dtype: float32
- name: '604'
dtype: float32
- name: '605'
dtype: float32
- name: '606'
dtype: float32
- name: '607'
dtype: float32
- name: '608'
dtype: float32
- name: '609'
dtype: float32
- name: '610'
dtype: float32
- name: '611'
dtype: float32
- name: '612'
dtype: float32
- name: '613'
dtype: float32
- name: '614'
dtype: float32
- name: '615'
dtype: float32
- name: '616'
dtype: float32
- name: '617'
dtype: float32
- name: '618'
dtype: float32
- name: '619'
dtype: float32
- name: '620'
dtype: float32
- name: '621'
dtype: float32
- name: '622'
dtype: float32
- name: '623'
dtype: float32
- name: '624'
dtype: float32
- name: '625'
dtype: float32
- name: '626'
dtype: float32
- name: '627'
dtype: float32
- name: '628'
dtype: float32
- name: '629'
dtype: float32
- name: '630'
dtype: float32
- name: '631'
dtype: float32
- name: '632'
dtype: float32
- name: '633'
dtype: float32
- name: '634'
dtype: float32
- name: '635'
dtype: float32
- name: '636'
dtype: float32
- name: '637'
dtype: float32
- name: '638'
dtype: float32
- name: '639'
dtype: float32
- name: '640'
dtype: float32
- name: '641'
dtype: float32
- name: '642'
dtype: float32
- name: '643'
dtype: float32
- name: '644'
dtype: float32
- name: '645'
dtype: float32
- name: '646'
dtype: float32
- name: '647'
dtype: float32
- name: '648'
dtype: float32
- name: '649'
dtype: float32
- name: '650'
dtype: float32
- name: '651'
dtype: float32
- name: '652'
dtype: float32
- name: '653'
dtype: float32
- name: '654'
dtype: float32
- name: '655'
dtype: float32
- name: '656'
dtype: float32
- name: '657'
dtype: float32
- name: '658'
dtype: float32
- name: '659'
dtype: float32
- name: '660'
dtype: float32
- name: '661'
dtype: float32
- name: '662'
dtype: float32
- name: '663'
dtype: float32
- name: '664'
dtype: float32
- name: '665'
dtype: float32
- name: '666'
dtype: float32
- name: '667'
dtype: float32
- name: '668'
dtype: float32
- name: '669'
dtype: float32
- name: '670'
dtype: float32
- name: '671'
dtype: float32
- name: '672'
dtype: float32
- name: '673'
dtype: float32
- name: '674'
dtype: float32
- name: '675'
dtype: float32
- name: '676'
dtype: float32
- name: '677'
dtype: float32
- name: '678'
dtype: float32
- name: '679'
dtype: float32
- name: '680'
dtype: float32
- name: '681'
dtype: float32
- name: '682'
dtype: float32
- name: '683'
dtype: float32
- name: '684'
dtype: float32
- name: '685'
dtype: float32
- name: '686'
dtype: float32
- name: '687'
dtype: float32
- name: '688'
dtype: float32
- name: '689'
dtype: float32
- name: '690'
dtype: float32
- name: '691'
dtype: float32
- name: '692'
dtype: float32
- name: '693'
dtype: float32
- name: '694'
dtype: float32
- name: '695'
dtype: float32
- name: '696'
dtype: float32
- name: '697'
dtype: float32
- name: '698'
dtype: float32
- name: '699'
dtype: float32
- name: '700'
dtype: float32
- name: '701'
dtype: float32
- name: '702'
dtype: float32
- name: '703'
dtype: float32
- name: '704'
dtype: float32
- name: '705'
dtype: float32
- name: '706'
dtype: float32
- name: '707'
dtype: float32
- name: '708'
dtype: float32
- name: '709'
dtype: float32
- name: '710'
dtype: float32
- name: '711'
dtype: float32
- name: '712'
dtype: float32
- name: '713'
dtype: float32
- name: '714'
dtype: float32
- name: '715'
dtype: float32
- name: '716'
dtype: float32
- name: '717'
dtype: float32
- name: '718'
dtype: float32
- name: '719'
dtype: float32
- name: '720'
dtype: float32
- name: '721'
dtype: float32
- name: '722'
dtype: float32
- name: '723'
dtype: float32
- name: '724'
dtype: float32
- name: '725'
dtype: float32
- name: '726'
dtype: float32
- name: '727'
dtype: float32
- name: '728'
dtype: float32
- name: '729'
dtype: float32
- name: '730'
dtype: float32
- name: '731'
dtype: float32
- name: '732'
dtype: float32
- name: '733'
dtype: float32
- name: '734'
dtype: float32
- name: '735'
dtype: float32
- name: '736'
dtype: float32
- name: '737'
dtype: float32
- name: '738'
dtype: float32
- name: '739'
dtype: float32
- name: '740'
dtype: float32
- name: '741'
dtype: float32
- name: '742'
dtype: float32
- name: '743'
dtype: float32
- name: '744'
dtype: float32
- name: '745'
dtype: float32
- name: '746'
dtype: float32
- name: '747'
dtype: float32
- name: '748'
dtype: float32
- name: '749'
dtype: float32
- name: '750'
dtype: float32
- name: '751'
dtype: float32
- name: '752'
dtype: float32
- name: '753'
dtype: float32
- name: '754'
dtype: float32
- name: '755'
dtype: float32
- name: '756'
dtype: float32
- name: '757'
dtype: float32
- name: '758'
dtype: float32
- name: '759'
dtype: float32
- name: '760'
dtype: float32
- name: '761'
dtype: float32
- name: '762'
dtype: float32
- name: '763'
dtype: float32
- name: '764'
dtype: float32
- name: '765'
dtype: float32
- name: '766'
dtype: float32
- name: '767'
dtype: float32
- name: label
dtype: string
splits:
- name: train
num_bytes: 115650065.625
num_examples: 37500
- name: test
num_bytes: 38550020.0
num_examples: 12500
download_size: 211788382
dataset_size: 154200085.625
---
# Dataset Card for "Spirit_RoBERTa_Finetuned"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
JudeChaer/adding | ---
license: mit
---
|
open-llm-leaderboard/details_Harshvir__LaMini-Neo-1.3B-Mental-Health_lora | ---
pretty_name: Evaluation run of Harshvir/LaMini-Neo-1.3B-Mental-Health_lora
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Harshvir/LaMini-Neo-1.3B-Mental-Health_lora](https://huggingface.co/Harshvir/LaMini-Neo-1.3B-Mental-Health_lora)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 3 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Harshvir__LaMini-Neo-1.3B-Mental-Health_lora\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-17T19:00:53.771505](https://huggingface.co/datasets/open-llm-leaderboard/details_Harshvir__LaMini-Neo-1.3B-Mental-Health_lora/blob/main/results_2023-09-17T19-00-53.771505.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0,\n \"\
em_stderr\": 0.0,\n \"f1\": 0.0,\n \"f1_stderr\": 0.0,\n \"\
acc\": 0.24585635359116023,\n \"acc_stderr\": 0.007025277661412099\n },\n\
\ \"harness|drop|3\": {\n \"em\": 0.0,\n \"em_stderr\": 0.0,\n\
\ \"f1\": 0.0,\n \"f1_stderr\": 0.0\n },\n \"harness|gsm8k|5\"\
: {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.49171270718232046,\n \"acc_stderr\": 0.014050555322824197\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Harshvir/LaMini-Neo-1.3B-Mental-Health_lora
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_drop_3
data_files:
- split: 2023_09_17T19_00_53.771505
path:
- '**/details_harness|drop|3_2023-09-17T19-00-53.771505.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-17T19-00-53.771505.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_17T19_00_53.771505
path:
- '**/details_harness|gsm8k|5_2023-09-17T19-00-53.771505.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-17T19-00-53.771505.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_17T19_00_53.771505
path:
- '**/details_harness|winogrande|5_2023-09-17T19-00-53.771505.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-17T19-00-53.771505.parquet'
- config_name: results
data_files:
- split: 2023_09_17T19_00_53.771505
path:
- results_2023-09-17T19-00-53.771505.parquet
- split: latest
path:
- results_2023-09-17T19-00-53.771505.parquet
---
# Dataset Card for Evaluation run of Harshvir/LaMini-Neo-1.3B-Mental-Health_lora
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Harshvir/LaMini-Neo-1.3B-Mental-Health_lora
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Harshvir/LaMini-Neo-1.3B-Mental-Health_lora](https://huggingface.co/Harshvir/LaMini-Neo-1.3B-Mental-Health_lora) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Harshvir__LaMini-Neo-1.3B-Mental-Health_lora",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-17T19:00:53.771505](https://huggingface.co/datasets/open-llm-leaderboard/details_Harshvir__LaMini-Neo-1.3B-Mental-Health_lora/blob/main/results_2023-09-17T19-00-53.771505.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0,
"em_stderr": 0.0,
"f1": 0.0,
"f1_stderr": 0.0,
"acc": 0.24585635359116023,
"acc_stderr": 0.007025277661412099
},
"harness|drop|3": {
"em": 0.0,
"em_stderr": 0.0,
"f1": 0.0,
"f1_stderr": 0.0
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.49171270718232046,
"acc_stderr": 0.014050555322824197
}
}
```
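Since the aggregated results are plain JSON, individual metrics can be pulled out with ordinary dictionary access once the file is loaded. A minimal sketch using the structure shown above (the JSON literal mirrors the snippet and is not fetched from the Hub):

```python
import json

# Mirrors the "latest results" snippet above (not fetched from the Hub)
results_json = """
{
    "all": {"em": 0.0, "em_stderr": 0.0, "f1": 0.0, "f1_stderr": 0.0,
            "acc": 0.24585635359116023, "acc_stderr": 0.007025277661412099},
    "harness|drop|3": {"em": 0.0, "em_stderr": 0.0, "f1": 0.0, "f1_stderr": 0.0},
    "harness|gsm8k|5": {"acc": 0.0, "acc_stderr": 0.0},
    "harness|winogrande|5": {"acc": 0.49171270718232046, "acc_stderr": 0.014050555322824197}
}
"""
results = json.loads(results_json)

# Per-task accuracy, skipping the "all" aggregate
task_acc = {task: metrics["acc"]
            for task, metrics in results.items()
            if task != "all" and "acc" in metrics}
```

The same pattern works on the real results file once it is downloaded from the repository linked above.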
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
AbhiSmruti/sample_data | ---
dataset_info:
features:
- name: '### Human: Can you write a short introduction about the relevance of the
term "monopsony" in economics? Please use examples related to potential monopsonies
in the labour market and cite relevant research.### Assistant: "Monopsony" refers
to a market structure where there is only one buyer for a particular good or
service. In economics, this term is particularly relevant in the labor market,
where a monopsony employer has significant power over the wages and working
conditions of their employees. The presence of a monopsony can result in lower
wages and reduced employment opportunities for workers, as the employer has
little incentive to increase wages or provide better working conditions. Recent
research has identified potential monopsonies in industries such as retail and
fast food, where a few large companies control a significant portion of the
market (Bivens & Mishel, 2013). In these industries, workers often face low
wages, limited benefits, and reduced bargaining power, leading to a situation
where they are dependent on the employer for their livelihood. This dependence
can result in further suppression of wages and a decline in working conditions.
Overall, the concept of monopsony is essential to understanding the dynamics
of labor markets and the impact of market power on workers. Further research
is needed to understand the extent and impact of monopsonies on the economy
and to develop policies to address this issue. References: Bivens, J., & Mishel,
L. (2013). The Pay of Corporate Executives and Financial Professionals as Evidence
of Rents in Top 1 Percent Incomes. Journal of Economic Perspectives, 27(3),
57-78.### Human: Now explain it to a dog'
dtype: string
splits:
- name: train
num_bytes: 3197
num_examples: 1
- name: test
num_bytes: 2867
num_examples: 1
download_size: 64365
dataset_size: 6064
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
mask-distilled-one-sec-cv12/chunk_260 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 912104500
num_examples: 179125
download_size: 928741648
dataset_size: 912104500
---
# Dataset Card for "chunk_260"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
hans | ---
annotations_creators:
- expert-generated
language_creators:
- expert-generated
language:
- en
license:
- unknown
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- text-classification
task_ids:
- natural-language-inference
paperswithcode_id: hans
pretty_name: Heuristic Analysis for NLI Systems
dataset_info:
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype:
class_label:
names:
'0': entailment
'1': non-entailment
- name: parse_premise
dtype: string
- name: parse_hypothesis
dtype: string
- name: binary_parse_premise
dtype: string
- name: binary_parse_hypothesis
dtype: string
- name: heuristic
dtype: string
- name: subcase
dtype: string
- name: template
dtype: string
config_name: plain_text
splits:
- name: train
num_bytes: 15916371
num_examples: 30000
- name: validation
num_bytes: 15893137
num_examples: 30000
download_size: 30947358
dataset_size: 31809508
---
# Dataset Card for "hans"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [https://github.com/tommccoy1/hans](https://github.com/tommccoy1/hans)
- **Repository:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of downloaded dataset files:** 30.94 MB
- **Size of the generated dataset:** 31.81 MB
- **Total amount of disk used:** 62.76 MB
### Dataset Summary
The HANS dataset is an NLI evaluation set that tests specific hypotheses about invalid heuristics that NLI models are likely to learn.
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Dataset Structure
### Data Instances
#### plain_text
- **Size of downloaded dataset files:** 30.94 MB
- **Size of the generated dataset:** 31.81 MB
- **Total amount of disk used:** 62.76 MB
An example of 'train' looks as follows.
```
```
### Data Fields
The data fields are the same among all splits.
#### plain_text
- `premise`: a `string` feature.
- `hypothesis`: a `string` feature.
- `label`: a classification label, with possible values including `entailment` (0), `non-entailment` (1).
- `parse_premise`: a `string` feature.
- `parse_hypothesis`: a `string` feature.
- `binary_parse_premise`: a `string` feature.
- `binary_parse_hypothesis`: a `string` feature.
- `heuristic`: a `string` feature.
- `subcase`: a `string` feature.
- `template`: a `string` feature.
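The `label` field is stored as an integer class label. The mapping declared in the card's `class_label` block can be mirrored in plain Python to convert model outputs back to label names (a sketch; the real `datasets.ClassLabel` feature exposes `int2str`/`str2int` with the same mapping):

```python
# Label names in index order, as declared in the card's class_label block
HANS_LABELS = ["entailment", "non-entailment"]

def int2str(index: int) -> str:
    """Convert an integer label to its string name."""
    return HANS_LABELS[index]

def str2int(name: str) -> int:
    """Convert a string label name to its integer id."""
    return HANS_LABELS.index(name)
```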
### Data Splits
| name |train|validation|
|----------|----:|---------:|
|plain_text|30000| 30000|
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@article{DBLP:journals/corr/abs-1902-01007,
author = {R. Thomas McCoy and
Ellie Pavlick and
Tal Linzen},
title = {Right for the Wrong Reasons: Diagnosing Syntactic Heuristics in Natural
Language Inference},
journal = {CoRR},
volume = {abs/1902.01007},
year = {2019},
url = {http://arxiv.org/abs/1902.01007},
archivePrefix = {arXiv},
eprint = {1902.01007},
timestamp = {Tue, 21 May 2019 18:03:36 +0200},
biburl = {https://dblp.org/rec/journals/corr/abs-1902-01007.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
```
### Contributions
Thanks to [@TevenLeScao](https://github.com/TevenLeScao), [@thomwolf](https://github.com/thomwolf) for adding this dataset. |
Tverous/misinfo-meta | ---
dataset_info:
features:
- name: uid
dtype: 'null'
- name: claim
dtype: 'null'
- name: main_text
dtype: 'null'
- name: image
dtype: 'null'
- name: video
dtype: 'null'
- name: audio
dtype: 'null'
- name: kg_embedding
dtype: 'null'
splits:
- name: train
num_bytes: 0
num_examples: 0
download_size: 0
dataset_size: 0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "misinfo-meta"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
phanvancongthanh/data_deduplicated_part04 | ---
dataset_info:
features:
- name: smiles
dtype: string
splits:
- name: train
num_bytes: 4854481434
num_examples: 103054258
download_size: 2391891371
dataset_size: 4854481434
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "data_deduplicated_part04"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kye/thepilebooks3-gptneox-8k | ---
license: mit
---
|
aidenTim/instruct-python-llama2-20k | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: text
dtype: string
splits:
- name: train
num_bytes: 424387944.3182734
num_examples: 209935
- name: test
num_bytes: 2021520.6817265982
num_examples: 1000
download_size: 217942961
dataset_size: 426409465.0
---
# Dataset Card for "instruct-python-llama2-20k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
vikramrn/time_series_c | ---
dataset_info:
features:
- name: past_values
sequence: float64
- name: future_values
sequence: float64
- name: static_categorical_features
sequence: int64
- name: past_observed_mask
sequence: int64
- name: future_time_features
sequence:
sequence: int64
- name: past_time_features
sequence:
sequence: int64
splits:
- name: train
num_bytes: 410633508
num_examples: 179787
download_size: 5607196
dataset_size: 410633508
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Helsinki-NLP/opus_fiskmo | ---
annotations_creators:
- found
language_creators:
- found
language:
- fi
- sv
license:
- unknown
multilinguality:
- translation
size_categories:
- 1M<n<10M
source_datasets:
- original
task_categories:
- translation
task_ids: []
pretty_name: OpusFiskmo
dataset_info:
config_name: fi-sv
features:
- name: translation
dtype:
translation:
languages:
- fi
- sv
splits:
- name: train
num_bytes: 326527146
num_examples: 2100001
download_size: 237248970
dataset_size: 326527146
configs:
- config_name: fi-sv
data_files:
- split: train
path: fi-sv/train-*
---
# Dataset Card for opus_fiskmo
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [fiskmo](http://opus.nlpl.eu/fiskmo.php)
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
fiskmo is a massive parallel corpus for Finnish and Swedish.
### Supported Tasks and Leaderboards
The underlying task is machine translation for the Finnish-Swedish language pair.
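As a sketch, the `fi-sv` configuration can be loaded with the `datasets` library (the load call is commented out here because it needs network access); each record follows the `translation` schema declared in the metadata, a dict keyed by language code. The sentence pair below is hypothetical:

```python
# Loading sketch (requires `pip install datasets` and network access):
# from datasets import load_dataset
# ds = load_dataset("Helsinki-NLP/opus_fiskmo", "fi-sv", split="train")
# record = ds[0]

# Each record follows the `translation` feature schema; the sentence
# pair below is illustrative, not taken from the corpus.
record = {"translation": {"fi": "Hyvää huomenta.", "sv": "God morgon."}}
source = record["translation"]["fi"]
target = record["translation"]["sv"]
print(source, "->", target)  # → Hyvää huomenta. -> God morgon.
```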
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
J. Tiedemann, 2012, Parallel Data, Tools and Interfaces in OPUS. In Proceedings of the 8th International Conference on Language Resources and Evaluation (LREC 2012)
### Contributions
Thanks to [@spatil6](https://github.com/spatil6) for adding this dataset. |
krishi/tartan10 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 18507396.0
num_examples: 10
download_size: 18509661
dataset_size: 18507396.0
---
# Dataset Card for "tartan10"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/hortensia_fireemblem | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of hortensia/オルテンシア (Fire Emblem)
This is the dataset of hortensia/オルテンシア (Fire Emblem), containing 156 images and their tags.
The core tags of this character are `pink_hair, bangs, pink_eyes, breasts, hair_rings, multicolored_hair, facial_mark, bow`, which are pruned in this dataset.
Images are crawled from many sites (e.g. Danbooru, Pixiv, Zerochan); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 156 | 247.67 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hortensia_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 156 | 136.94 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hortensia_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 362 | 293.10 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hortensia_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 156 | 214.18 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hortensia_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 362 | 433.80 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hortensia_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/hortensia_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 12 |  |  |  |  |  | 1girl, heart, looking_at_viewer, open_mouth, smile, solo, one_eye_closed, juliet_sleeves, ;d, cleavage, red_rose, upper_body, white_background, simple_background, blush, streaked_hair, medium_breasts, v_over_eye |
| 1 | 6 |  |  |  |  |  | 1girl, juliet_sleeves, looking_at_viewer, red_rose, smile, solo, simple_background, cleavage, heart_tattoo, medium_breasts, open_mouth, upper_body, green_background |
| 2 | 6 |  |  |  |  |  | 1girl, hair_bow, looking_at_viewer, smile, solo, choker, earrings, upper_body, heart_hands, long_sleeves, open_mouth, black_gloves, cleavage, polka_dot_bow, purple_eyes, red_jacket, simple_background, streaked_hair |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | heart | looking_at_viewer | open_mouth | smile | solo | one_eye_closed | juliet_sleeves | ;d | cleavage | red_rose | upper_body | white_background | simple_background | blush | streaked_hair | medium_breasts | v_over_eye | heart_tattoo | green_background | hair_bow | choker | earrings | heart_hands | long_sleeves | black_gloves | polka_dot_bow | purple_eyes | red_jacket |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:--------------------|:-------------|:--------|:-------|:-----------------|:-----------------|:-----|:-----------|:-----------|:-------------|:-------------------|:--------------------|:--------|:----------------|:-----------------|:-------------|:---------------|:-------------------|:-----------|:---------|:-----------|:--------------|:---------------|:---------------|:----------------|:--------------|:-------------|
| 0 | 12 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | | X | X | X | X | | X | | X | X | X | | X | | | X | | X | X | | | | | | | | | |
| 2 | 6 |  |  |  |  |  | X | | X | X | X | X | | | | X | | X | | X | | X | | | | | X | X | X | X | X | X | X | X | X |
|
Itau-Unibanco/FAQ_BACEN | ---
license: apache-2.0
task_categories:
- text-classification
- question-answering
language:
- pt
tags:
- finance
size_categories:
- 1K<n<10K
---
This dataset was used in the article: https://arxiv.org/abs/2311.11331 |
CyberHarem/clarisse_granbluefantasy | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of clarisse/クラリス (Granblue Fantasy)
This is the dataset of clarisse/クラリス (Granblue Fantasy), containing 500 images and their tags.
The core tags of this character are `long_hair, breasts, ribbon, ponytail, hair_ribbon, green_eyes, brown_hair, medium_breasts, orange_hair, bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. Danbooru, Pixiv, Zerochan); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 678.32 MiB | [Download](https://huggingface.co/datasets/CyberHarem/clarisse_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 403.93 MiB | [Download](https://huggingface.co/datasets/CyberHarem/clarisse_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1220 | 849.88 MiB | [Download](https://huggingface.co/datasets/CyberHarem/clarisse_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 609.21 MiB | [Download](https://huggingface.co/datasets/CyberHarem/clarisse_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1220 | 1.16 GiB | [Download](https://huggingface.co/datasets/CyberHarem/clarisse_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/clarisse_granbluefantasy',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 33 |  |  |  |  |  | 1girl, black_gloves, skirt, solo, cape, looking_at_viewer, smile, black_thighhighs, one_eye_closed, open_mouth, ;d, book, boots, blush, v_over_eye, sleeveless |
| 1 | 7 |  |  |  |  |  | 1girl, :d, black_gloves, black_ribbon, black_thighhighs, cape, looking_at_viewer, open_mouth, sleeveless, solo, sideboob, simple_background, very_long_hair, white_background, blush, red_skirt, test_tube, black_footwear, high_heel_boots, holding_book, knee_boots, open_book |
| 2 | 26 |  |  |  |  |  | cape, 1girl, black_gloves, christmas, santa_hat, solo, navel, black_thighhighs, blush, fur_trim, looking_at_viewer, cleavage, open_mouth, santa_bikini, one_eye_closed, boots, red_bikini, very_long_hair, :d |
| 3 | 5 |  |  |  |  |  | 1girl, blush, detached_sleeves, hairband, long_sleeves, looking_at_viewer, solo, white_background, white_shirt, bare_shoulders, closed_mouth, red_skirt, simple_background, very_long_hair, bow, low_twintails, sleeveless_shirt, white_sweater, plaid_skirt, red_ribbon, sleeves_past_wrists, smile, thighhighs, turtleneck |
| 4 | 9 |  |  |  |  |  | 1girl, blush, hairband, looking_at_viewer, red_skirt, solo, bare_shoulders, detached_sleeves, plaid_skirt, valentine, very_long_hair, aqua_eyes, holding, long_sleeves, thighhighs, white_background, apron, scarf, simple_background, smile, closed_mouth, gift, heart-shaped_box, large_breasts, twintails, white_shirt |
| 5 | 9 |  |  |  |  |  | 1girl, bare_shoulders, blush, looking_at_viewer, red_bikini, solo, hair_flower, cleavage, navel, very_long_hair, smile, beach, bracelet, cloud, collarbone, frilled_bikini, open_mouth, outdoors, sarong, sky |
| 6 | 6 |  |  |  |  |  | 1girl, bare_shoulders, blush, looking_at_viewer, solo, black_thighhighs, turtleneck, sleeveless_shirt, very_long_hair, white_panties, armpits, closed_mouth, on_back, side-tie_panties, skirt_lift, smile, sweater, swept_bangs, white_shirt |
| 7 | 23 |  |  |  |  |  | 1girl, hair_bow, solo, looking_at_viewer, official_alternate_costume, hair_flower, blush, chest_harness, white_dress, black_gloves, white_bow, elbow_gloves, white_background, bare_shoulders, earrings, red_rose, smile, pantyhose |
| 8 | 21 |  |  |  |  |  | 1girl, blush, hetero, solo_focus, 1boy, penis, nipples, open_mouth, pussy, thighhighs, large_breasts, bar_censor, sex, vaginal, sweat, spread_legs, cum, female_pubic_hair, gloves |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_gloves | skirt | solo | cape | looking_at_viewer | smile | black_thighhighs | one_eye_closed | open_mouth | ;d | book | boots | blush | v_over_eye | sleeveless | :d | black_ribbon | sideboob | simple_background | very_long_hair | white_background | red_skirt | test_tube | black_footwear | high_heel_boots | holding_book | knee_boots | open_book | christmas | santa_hat | navel | fur_trim | cleavage | santa_bikini | red_bikini | detached_sleeves | hairband | long_sleeves | white_shirt | bare_shoulders | closed_mouth | bow | low_twintails | sleeveless_shirt | white_sweater | plaid_skirt | red_ribbon | sleeves_past_wrists | thighhighs | turtleneck | valentine | aqua_eyes | holding | apron | scarf | gift | heart-shaped_box | large_breasts | twintails | hair_flower | beach | bracelet | cloud | collarbone | frilled_bikini | outdoors | sarong | sky | white_panties | armpits | on_back | side-tie_panties | skirt_lift | sweater | swept_bangs | hair_bow | official_alternate_costume | chest_harness | white_dress | white_bow | elbow_gloves | earrings | red_rose | pantyhose | hetero | solo_focus | 1boy | penis | nipples | pussy | bar_censor | sex | vaginal | sweat | spread_legs | cum | female_pubic_hair | gloves |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:--------|:-------|:-------|:--------------------|:--------|:-------------------|:-----------------|:-------------|:-----|:-------|:--------|:--------|:-------------|:-------------|:-----|:---------------|:-----------|:--------------------|:-----------------|:-------------------|:------------|:------------|:-----------------|:------------------|:---------------|:-------------|:------------|:------------|:------------|:--------|:-----------|:-----------|:---------------|:-------------|:-------------------|:-----------|:---------------|:--------------|:-----------------|:---------------|:------|:----------------|:-------------------|:----------------|:--------------|:-------------|:----------------------|:-------------|:-------------|:------------|:------------|:----------|:--------|:--------|:-------|:-------------------|:----------------|:------------|:--------------|:--------|:-----------|:--------|:-------------|:-----------------|:-----------|:---------|:------|:----------------|:----------|:----------|:-------------------|:-------------|:----------|:--------------|:-----------|:-----------------------------|:----------------|:--------------|:------------|:---------------|:-----------|:-----------|:------------|:---------|:-------------|:-------|:--------|:----------|:--------|:-------------|:------|:----------|:--------|:--------------|:------|:--------------------|:---------|
| 0 | 33 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 7 |  |  |  |  |  | X | X | | X | X | X | | X | | X | | | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 26 |  |  |  |  |  | X | X | | X | X | X | | X | X | X | | | X | X | | | X | | | | X | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | X | | | X | | X | X | | | | | | | X | | | | | | X | X | X | X | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 9 |  |  |  |  |  | X | | | X | | X | X | | | | | | | X | | | | | | X | X | X | X | | | | | | | | | | | | | | X | X | X | X | X | X | | | | | X | | | X | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 9 |  |  |  |  |  | X | | | X | | X | X | | | X | | | | X | | | | | | | X | | | | | | | | | | | X | | X | | X | | | | | X | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 6 |  |  |  |  |  | X | | | X | | X | X | X | | | | | | X | | | | | | | X | | | | | | | | | | | | | | | | | | | X | X | X | | | X | | | | | | X | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 23 |  |  |  |  |  | X | X | | X | | X | X | | | | | | | X | | | | | | | | X | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | |
| 8 | 21 |  |  |  |  |  | X | | | | | | | | | X | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
bdotloh/empathetic-dialogues-contexts | ---
annotations_creators:
- crowdsourced
language:
- en
multilinguality:
- monolingual
task_categories:
- text-classification
---
# Dataset Description
This is a dataset of emotional contexts retrieved from the original EmpatheticDialogues (ED) dataset. Respondents were asked to describe an event associated with a particular emotion label (i.e. p(event | emotion)).
There are 32 emotion labels in total.
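The p(event | emotion) framing means each row pairs an emotion label with a free-text description of an event; a minimal sketch of grouping such rows by label (the rows below are toy data, not drawn from the dataset):

```python
from collections import defaultdict

# Toy (emotion label, event description) rows in the shape described above;
# the examples are invented, not drawn from the dataset.
rows = [
    ("joyful", "I finally got the job I interviewed for last month."),
    ("afraid", "I heard a loud noise downstairs in the middle of the night."),
    ("joyful", "My best friend surprised me on my birthday."),
]

# Group the event descriptions by their emotion label
by_emotion = defaultdict(list)
for emotion, event in rows:
    by_emotion[emotion].append(event)

print({k: len(v) for k, v in by_emotion.items()})  # → {'joyful': 2, 'afraid': 1}
```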
There are 19,209, 2,756, and 2,542 instances of emotional descriptions in the train, validation, and test sets, respectively. |
yankscally/midiset | ---
license: unknown
---
This is my first dataset, made from 80k VGM MIDI tracks found on archive.org.
|
roa7n/maltaomics_dataset_clustered | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: seq
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 429909
num_examples: 1600
- name: test
num_bytes: 106032
num_examples: 400
download_size: 0
dataset_size: 535941
---
# Dataset Card for "maltaomics_dataset_clustered"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
logosrhema/ug-examine-data | ---
license: mit
---
|
marceloslo/2016 | ---
dataset_info:
features:
- name: timestamp
dtype: timestamp[s]
- name: url
dtype: string
- name: content
dtype: string
splits:
- name: train
num_bytes: 25384471256.739643
num_examples: 10000000
download_size: 16163891868
dataset_size: 25384471256.739643
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
tyzhu/find_marker_after_sent_train_200_eval_40 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: title
dtype: string
- name: context
dtype: string
splits:
- name: train
num_bytes: 1445507
num_examples: 1254
- name: validation
num_bytes: 214957
num_examples: 198
download_size: 351050
dataset_size: 1660464
---
# Dataset Card for "find_marker_after_sent_train_200_eval_40"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kejian/tuluv2_sft_mixture_no_science | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: dataset
dtype: string
- name: id
dtype: string
- name: messages
list:
- name: role
dtype: string
- name: content
dtype: string
splits:
- name: train
num_bytes: 1210917065.806392
num_examples: 318686
download_size: 0
dataset_size: 1210917065.806392
---
# Dataset Card for "tuluv2_sft_mixture_no_science"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
owkin/camelyon16-features | ---
dataset_info:
features:
- name: features
sequence:
sequence: float32
- name: label
dtype: int64
splits:
- name: Phikon_test
num_bytes: 401342744
num_examples: 130
- name: Phikon_train
num_bytes: 808932620
num_examples: 269
download_size: 1210840794
dataset_size: 1210275364
configs:
- config_name: default
data_files:
- split: Phikon_test
path: data/Phikon_test-*
- split: Phikon_train
path: data/Phikon_train-*
license: other
task_categories:
- feature-extraction
- image-classification
language:
- en
tags:
- biology
- medical
- cancer
pretty_name: Camelyon16 Features
size_categories:
- n<1K
---
# Dataset Card for Camelyon16-features
### Dataset Summary
The Camelyon16 dataset is a widely used benchmark in cancer classification.

The dataset uploaded here consists of features extracted from the Camelyon16 slides using the Phikon model, which is also openly available on Hugging Face.
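Each row stores a variable-length bag of tile-level feature vectors for one slide, plus a slide-level label. Below is a minimal sketch of consuming such a bag; the tile count, the 768-dimensional embedding size, and the mean-pooling baseline are all assumptions for illustration, not part of this dataset's documentation:

```python
import numpy as np

# Hypothetical slide: a bag of tile-level features, mimicking one entry of
# the `features` column. The tile count (1000) and embedding size (768)
# are assumed for illustration.
rng = np.random.default_rng(0)
slide_features = rng.normal(size=(1000, 768)).astype(np.float32)

# A simple MIL-style baseline: mean-pool the tiles into one slide-level
# embedding, which a downstream classifier would map to the `label`.
slide_embedding = slide_features.mean(axis=0)
print(slide_embedding.shape)  # → (768,)
```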
## Dataset Creation
### Initial Data Collection and Normalization
The initial collection of the Camelyon16 whole-slide images is credited to:
- Radboud University Medical Center (Nijmegen, the Netherlands)
- University Medical Center Utrecht (Utrecht, the Netherlands)
### Licensing Information
This dataset is under [Owkin non-commercial license](https://github.com/owkin/HistoSSLscaling/blob/main/LICENSE.txt).
### Citation Information
Owkin claims no ownership of this dataset. This is simply an extraction of features from the original dataset.
- [Link to original dataset](https://camelyon16.grand-challenge.org/)
- [Link to original paper](https://jamanetwork.com/journals/jama/fullarticle/2665774) |
Deojoandco/capstone_fromgpt_without_gold_v7 | ---
dataset_info:
features:
- name: dialog_id
dtype: int64
- name: dialogue
dtype: string
- name: summary
dtype: string
- name: gold_tags
dtype: string
- name: gpt_success
dtype: bool
- name: gpt_response
dtype: string
- name: gold_tags_tokens_count
dtype: int64
- name: GPT_TAGS_FOUND
dtype: bool
- name: gpt_output_tags
dtype: string
- name: gpt_output_tag_tokens_count
dtype: int64
- name: GPT_MI_FOUND
dtype: bool
- name: gpt_tags_token_count
dtype: int64
- name: gpt_tags
dtype: string
- name: tag_token_count_match
dtype: bool
splits:
- name: test
num_bytes: 21303
num_examples: 12
download_size: 23320
dataset_size: 21303
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
# Dataset Card for "capstone_fromgpt_without_gold_v7"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_SJ-Donald__SJ-SOLAR-10.7b-DPO | ---
pretty_name: Evaluation run of SJ-Donald/SJ-SOLAR-10.7b-DPO
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [SJ-Donald/SJ-SOLAR-10.7b-DPO](https://huggingface.co/SJ-Donald/SJ-SOLAR-10.7b-DPO)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_SJ-Donald__SJ-SOLAR-10.7b-DPO\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-25T05:53:20.241050](https://huggingface.co/datasets/open-llm-leaderboard/details_SJ-Donald__SJ-SOLAR-10.7b-DPO/blob/main/results_2024-01-25T05-53-20.241050.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6694201238242145,\n\
\ \"acc_stderr\": 0.03145425883361444,\n \"acc_norm\": 0.6709590638465028,\n\
\ \"acc_norm_stderr\": 0.03209348907350449,\n \"mc1\": 0.5152998776009792,\n\
\ \"mc1_stderr\": 0.0174953044731879,\n \"mc2\": 0.6774426022949598,\n\
\ \"mc2_stderr\": 0.014870145786575549\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6535836177474402,\n \"acc_stderr\": 0.013905011180063232,\n\
\ \"acc_norm\": 0.6825938566552902,\n \"acc_norm_stderr\": 0.013602239088038167\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6835291774546903,\n\
\ \"acc_stderr\": 0.0046414842733351,\n \"acc_norm\": 0.8695478988249352,\n\
\ \"acc_norm_stderr\": 0.003361118395452385\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5703703703703704,\n\
\ \"acc_stderr\": 0.042763494943765995,\n \"acc_norm\": 0.5703703703703704,\n\
\ \"acc_norm_stderr\": 0.042763494943765995\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7631578947368421,\n \"acc_stderr\": 0.03459777606810535,\n\
\ \"acc_norm\": 0.7631578947368421,\n \"acc_norm_stderr\": 0.03459777606810535\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.72,\n\
\ \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.72,\n \
\ \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.02815283794249387,\n\
\ \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.02815283794249387\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n\
\ \"acc_stderr\": 0.0358687928008034,\n \"acc_norm\": 0.7569444444444444,\n\
\ \"acc_norm_stderr\": 0.0358687928008034\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6994219653179191,\n\
\ \"acc_stderr\": 0.0349610148119118,\n \"acc_norm\": 0.6994219653179191,\n\
\ \"acc_norm_stderr\": 0.0349610148119118\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.048580835742663454,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.048580835742663454\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6212765957446809,\n \"acc_stderr\": 0.03170995606040655,\n\
\ \"acc_norm\": 0.6212765957446809,\n \"acc_norm_stderr\": 0.03170995606040655\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6275862068965518,\n \"acc_stderr\": 0.04028731532947559,\n\
\ \"acc_norm\": 0.6275862068965518,\n \"acc_norm_stderr\": 0.04028731532947559\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.48148148148148145,\n \"acc_stderr\": 0.02573364199183898,\n \"\
acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.02573364199183898\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.044444444444444495,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.044444444444444495\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8064516129032258,\n\
\ \"acc_stderr\": 0.022475258525536057,\n \"acc_norm\": 0.8064516129032258,\n\
\ \"acc_norm_stderr\": 0.022475258525536057\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.806060606060606,\n \"acc_stderr\": 0.03087414513656209,\n\
\ \"acc_norm\": 0.806060606060606,\n \"acc_norm_stderr\": 0.03087414513656209\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8686868686868687,\n \"acc_stderr\": 0.024063156416822516,\n \"\
acc_norm\": 0.8686868686868687,\n \"acc_norm_stderr\": 0.024063156416822516\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328972,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328972\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.676923076923077,\n \"acc_stderr\": 0.023710888501970565,\n \
\ \"acc_norm\": 0.676923076923077,\n \"acc_norm_stderr\": 0.023710888501970565\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.37777777777777777,\n \"acc_stderr\": 0.029560707392465715,\n \
\ \"acc_norm\": 0.37777777777777777,\n \"acc_norm_stderr\": 0.029560707392465715\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7184873949579832,\n \"acc_stderr\": 0.029213549414372174,\n\
\ \"acc_norm\": 0.7184873949579832,\n \"acc_norm_stderr\": 0.029213549414372174\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8568807339449541,\n \"acc_stderr\": 0.01501446249716859,\n \"\
acc_norm\": 0.8568807339449541,\n \"acc_norm_stderr\": 0.01501446249716859\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6111111111111112,\n \"acc_stderr\": 0.033247089118091176,\n \"\
acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.033247089118091176\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8676470588235294,\n \"acc_stderr\": 0.023784297520918856,\n \"\
acc_norm\": 0.8676470588235294,\n \"acc_norm_stderr\": 0.023784297520918856\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8776371308016878,\n \"acc_stderr\": 0.021331741829746786,\n \
\ \"acc_norm\": 0.8776371308016878,\n \"acc_norm_stderr\": 0.021331741829746786\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7174887892376681,\n\
\ \"acc_stderr\": 0.03021683101150878,\n \"acc_norm\": 0.7174887892376681,\n\
\ \"acc_norm_stderr\": 0.03021683101150878\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7404580152671756,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.7404580152671756,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"\
acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7423312883435583,\n \"acc_stderr\": 0.03436150827846917,\n\
\ \"acc_norm\": 0.7423312883435583,\n \"acc_norm_stderr\": 0.03436150827846917\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n\
\ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n\
\ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8186462324393359,\n\
\ \"acc_stderr\": 0.013778693778464076,\n \"acc_norm\": 0.8186462324393359,\n\
\ \"acc_norm_stderr\": 0.013778693778464076\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7658959537572254,\n \"acc_stderr\": 0.022797110278071124,\n\
\ \"acc_norm\": 0.7658959537572254,\n \"acc_norm_stderr\": 0.022797110278071124\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4402234636871508,\n\
\ \"acc_stderr\": 0.01660256461504994,\n \"acc_norm\": 0.4402234636871508,\n\
\ \"acc_norm_stderr\": 0.01660256461504994\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.761437908496732,\n \"acc_stderr\": 0.02440439492808787,\n\
\ \"acc_norm\": 0.761437908496732,\n \"acc_norm_stderr\": 0.02440439492808787\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7331189710610932,\n\
\ \"acc_stderr\": 0.025122637608816657,\n \"acc_norm\": 0.7331189710610932,\n\
\ \"acc_norm_stderr\": 0.025122637608816657\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7808641975308642,\n \"acc_stderr\": 0.023016705640262196,\n\
\ \"acc_norm\": 0.7808641975308642,\n \"acc_norm_stderr\": 0.023016705640262196\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5390070921985816,\n \"acc_stderr\": 0.02973659252642444,\n \
\ \"acc_norm\": 0.5390070921985816,\n \"acc_norm_stderr\": 0.02973659252642444\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5032594524119948,\n\
\ \"acc_stderr\": 0.012769964760343318,\n \"acc_norm\": 0.5032594524119948,\n\
\ \"acc_norm_stderr\": 0.012769964760343318\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.026799562024887667,\n\
\ \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.026799562024887667\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7026143790849673,\n \"acc_stderr\": 0.018492596536396955,\n \
\ \"acc_norm\": 0.7026143790849673,\n \"acc_norm_stderr\": 0.018492596536396955\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n\
\ \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n\
\ \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7795918367346939,\n \"acc_stderr\": 0.02653704531214529,\n\
\ \"acc_norm\": 0.7795918367346939,\n \"acc_norm_stderr\": 0.02653704531214529\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n\
\ \"acc_stderr\": 0.025538433368578337,\n \"acc_norm\": 0.845771144278607,\n\
\ \"acc_norm_stderr\": 0.025538433368578337\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352203,\n \
\ \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352203\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.572289156626506,\n\
\ \"acc_stderr\": 0.03851597683718533,\n \"acc_norm\": 0.572289156626506,\n\
\ \"acc_norm_stderr\": 0.03851597683718533\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n\
\ \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5152998776009792,\n\
\ \"mc1_stderr\": 0.0174953044731879,\n \"mc2\": 0.6774426022949598,\n\
\ \"mc2_stderr\": 0.014870145786575549\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8421468034727704,\n \"acc_stderr\": 0.010247165248719763\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6209249431387415,\n \
\ \"acc_stderr\": 0.013363630295088361\n }\n}\n```"
repo_url: https://huggingface.co/SJ-Donald/SJ-SOLAR-10.7b-DPO
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_25T05_53_20.241050
path:
- '**/details_harness|arc:challenge|25_2024-01-25T05-53-20.241050.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-25T05-53-20.241050.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_25T05_53_20.241050
path:
- '**/details_harness|gsm8k|5_2024-01-25T05-53-20.241050.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-25T05-53-20.241050.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_25T05_53_20.241050
path:
- '**/details_harness|hellaswag|10_2024-01-25T05-53-20.241050.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-25T05-53-20.241050.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_25T05_53_20.241050
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-25T05-53-20.241050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-25T05-53-20.241050.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-25T05-53-20.241050.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_25T05_53_20.241050
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-25T05-53-20.241050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-25T05-53-20.241050.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_25T05_53_20.241050
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-25T05-53-20.241050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-25T05-53-20.241050.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_25T05_53_20.241050
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-25T05-53-20.241050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-25T05-53-20.241050.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_25T05_53_20.241050
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-25T05-53-20.241050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-25T05-53-20.241050.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_25T05_53_20.241050
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-25T05-53-20.241050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-25T05-53-20.241050.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_25T05_53_20.241050
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-25T05-53-20.241050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-25T05-53-20.241050.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_25T05_53_20.241050
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-25T05-53-20.241050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-25T05-53-20.241050.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_25T05_53_20.241050
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-25T05-53-20.241050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-25T05-53-20.241050.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_25T05_53_20.241050
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-25T05-53-20.241050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-25T05-53-20.241050.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_25T05_53_20.241050
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-25T05-53-20.241050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-25T05-53-20.241050.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_25T05_53_20.241050
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-25T05-53-20.241050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-25T05-53-20.241050.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_25T05_53_20.241050
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-25T05-53-20.241050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-25T05-53-20.241050.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_25T05_53_20.241050
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-25T05-53-20.241050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-25T05-53-20.241050.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_25T05_53_20.241050
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-25T05-53-20.241050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-25T05-53-20.241050.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_25T05_53_20.241050
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-25T05-53-20.241050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-25T05-53-20.241050.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_25T05_53_20.241050
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-25T05-53-20.241050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-25T05-53-20.241050.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_25T05_53_20.241050
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-25T05-53-20.241050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-25T05-53-20.241050.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_25T05_53_20.241050
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-25T05-53-20.241050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-25T05-53-20.241050.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_25T05_53_20.241050
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-25T05-53-20.241050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-25T05-53-20.241050.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_25T05_53_20.241050
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-25T05-53-20.241050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-25T05-53-20.241050.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_25T05_53_20.241050
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-25T05-53-20.241050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-25T05-53-20.241050.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_25T05_53_20.241050
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-25T05-53-20.241050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-25T05-53-20.241050.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_25T05_53_20.241050
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-25T05-53-20.241050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-25T05-53-20.241050.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_25T05_53_20.241050
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-25T05-53-20.241050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-25T05-53-20.241050.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_25T05_53_20.241050
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-25T05-53-20.241050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-25T05-53-20.241050.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_25T05_53_20.241050
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-25T05-53-20.241050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-25T05-53-20.241050.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_25T05_53_20.241050
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-25T05-53-20.241050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-25T05-53-20.241050.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_25T05_53_20.241050
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-25T05-53-20.241050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-25T05-53-20.241050.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_25T05_53_20.241050
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-25T05-53-20.241050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-25T05-53-20.241050.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_25T05_53_20.241050
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-25T05-53-20.241050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-25T05-53-20.241050.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_25T05_53_20.241050
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-25T05-53-20.241050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-25T05-53-20.241050.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_25T05_53_20.241050
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-25T05-53-20.241050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-25T05-53-20.241050.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_25T05_53_20.241050
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-25T05-53-20.241050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-25T05-53-20.241050.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_25T05_53_20.241050
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-25T05-53-20.241050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-25T05-53-20.241050.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_25T05_53_20.241050
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-25T05-53-20.241050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-25T05-53-20.241050.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_25T05_53_20.241050
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-25T05-53-20.241050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-25T05-53-20.241050.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_25T05_53_20.241050
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-25T05-53-20.241050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-25T05-53-20.241050.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_25T05_53_20.241050
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-25T05-53-20.241050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-25T05-53-20.241050.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_25T05_53_20.241050
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-25T05-53-20.241050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-25T05-53-20.241050.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_25T05_53_20.241050
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-25T05-53-20.241050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-25T05-53-20.241050.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_25T05_53_20.241050
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-25T05-53-20.241050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-25T05-53-20.241050.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_25T05_53_20.241050
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-25T05-53-20.241050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-25T05-53-20.241050.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_25T05_53_20.241050
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-25T05-53-20.241050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-25T05-53-20.241050.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_25T05_53_20.241050
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-25T05-53-20.241050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-25T05-53-20.241050.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_25T05_53_20.241050
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-25T05-53-20.241050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-25T05-53-20.241050.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_25T05_53_20.241050
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-25T05-53-20.241050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-25T05-53-20.241050.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_25T05_53_20.241050
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-25T05-53-20.241050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-25T05-53-20.241050.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_25T05_53_20.241050
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-25T05-53-20.241050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-25T05-53-20.241050.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_25T05_53_20.241050
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-25T05-53-20.241050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-25T05-53-20.241050.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_25T05_53_20.241050
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-25T05-53-20.241050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-25T05-53-20.241050.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_25T05_53_20.241050
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-25T05-53-20.241050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-25T05-53-20.241050.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_25T05_53_20.241050
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-25T05-53-20.241050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-25T05-53-20.241050.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_25T05_53_20.241050
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-25T05-53-20.241050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-25T05-53-20.241050.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_25T05_53_20.241050
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-25T05-53-20.241050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-25T05-53-20.241050.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_25T05_53_20.241050
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-25T05-53-20.241050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-25T05-53-20.241050.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_25T05_53_20.241050
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-25T05-53-20.241050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-25T05-53-20.241050.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_25T05_53_20.241050
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-25T05-53-20.241050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-25T05-53-20.241050.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_25T05_53_20.241050
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-25T05-53-20.241050.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-25T05-53-20.241050.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_25T05_53_20.241050
path:
- '**/details_harness|winogrande|5_2024-01-25T05-53-20.241050.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-25T05-53-20.241050.parquet'
- config_name: results
data_files:
- split: 2024_01_25T05_53_20.241050
path:
- results_2024-01-25T05-53-20.241050.parquet
- split: latest
path:
- results_2024-01-25T05-53-20.241050.parquet
---
# Dataset Card for Evaluation run of SJ-Donald/SJ-SOLAR-10.7b-DPO
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [SJ-Donald/SJ-SOLAR-10.7b-DPO](https://huggingface.co/SJ-Donald/SJ-SOLAR-10.7b-DPO) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_SJ-Donald__SJ-SOLAR-10.7b-DPO",
"harness_winogrande_5",
split="train")
```
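Each per-task configuration in this repository follows the naming pattern `harness_hendrycksTest_<task>_5` (see the configuration list above). A minimal sketch constructing a few of these names, where the task list is an illustrative subset:

```python
# Build configuration names for a few MMLU sub-tasks, following the
# harness_hendrycksTest_<task>_5 naming pattern used in this repository.
tasks = ["high_school_biology", "moral_scenarios", "world_religions"]
configs = [f"harness_hendrycksTest_{t}_5" for t in tasks]
```

Any of these names can be passed as the second argument to `load_dataset` in place of `harness_winogrande_5` above.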
## Latest results
These are the [latest results from run 2024-01-25T05:53:20.241050](https://huggingface.co/datasets/open-llm-leaderboard/details_SJ-Donald__SJ-SOLAR-10.7b-DPO/blob/main/results_2024-01-25T05-53-20.241050.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6694201238242145,
"acc_stderr": 0.03145425883361444,
"acc_norm": 0.6709590638465028,
"acc_norm_stderr": 0.03209348907350449,
"mc1": 0.5152998776009792,
"mc1_stderr": 0.0174953044731879,
"mc2": 0.6774426022949598,
"mc2_stderr": 0.014870145786575549
},
"harness|arc:challenge|25": {
"acc": 0.6535836177474402,
"acc_stderr": 0.013905011180063232,
"acc_norm": 0.6825938566552902,
"acc_norm_stderr": 0.013602239088038167
},
"harness|hellaswag|10": {
"acc": 0.6835291774546903,
"acc_stderr": 0.0046414842733351,
"acc_norm": 0.8695478988249352,
"acc_norm_stderr": 0.003361118395452385
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5703703703703704,
"acc_stderr": 0.042763494943765995,
"acc_norm": 0.5703703703703704,
"acc_norm_stderr": 0.042763494943765995
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7631578947368421,
"acc_stderr": 0.03459777606810535,
"acc_norm": 0.7631578947368421,
"acc_norm_stderr": 0.03459777606810535
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.02815283794249387,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.02815283794249387
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.0358687928008034,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.0358687928008034
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6994219653179191,
"acc_stderr": 0.0349610148119118,
"acc_norm": 0.6994219653179191,
"acc_norm_stderr": 0.0349610148119118
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.048580835742663454,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.048580835742663454
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6212765957446809,
"acc_stderr": 0.03170995606040655,
"acc_norm": 0.6212765957446809,
"acc_norm_stderr": 0.03170995606040655
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6275862068965518,
"acc_stderr": 0.04028731532947559,
"acc_norm": 0.6275862068965518,
"acc_norm_stderr": 0.04028731532947559
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.02573364199183898,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.02573364199183898
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.044444444444444495,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.044444444444444495
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8064516129032258,
"acc_stderr": 0.022475258525536057,
"acc_norm": 0.8064516129032258,
"acc_norm_stderr": 0.022475258525536057
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.806060606060606,
"acc_stderr": 0.03087414513656209,
"acc_norm": 0.806060606060606,
"acc_norm_stderr": 0.03087414513656209
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8686868686868687,
"acc_stderr": 0.024063156416822516,
"acc_norm": 0.8686868686868687,
"acc_norm_stderr": 0.024063156416822516
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328972,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328972
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.676923076923077,
"acc_stderr": 0.023710888501970565,
"acc_norm": 0.676923076923077,
"acc_norm_stderr": 0.023710888501970565
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37777777777777777,
"acc_stderr": 0.029560707392465715,
"acc_norm": 0.37777777777777777,
"acc_norm_stderr": 0.029560707392465715
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7184873949579832,
"acc_stderr": 0.029213549414372174,
"acc_norm": 0.7184873949579832,
"acc_norm_stderr": 0.029213549414372174
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8568807339449541,
"acc_stderr": 0.01501446249716859,
"acc_norm": 0.8568807339449541,
"acc_norm_stderr": 0.01501446249716859
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.033247089118091176,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.033247089118091176
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8676470588235294,
"acc_stderr": 0.023784297520918856,
"acc_norm": 0.8676470588235294,
"acc_norm_stderr": 0.023784297520918856
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8776371308016878,
"acc_stderr": 0.021331741829746786,
"acc_norm": 0.8776371308016878,
"acc_norm_stderr": 0.021331741829746786
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7174887892376681,
"acc_stderr": 0.03021683101150878,
"acc_norm": 0.7174887892376681,
"acc_norm_stderr": 0.03021683101150878
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7404580152671756,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.7404580152671756,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7423312883435583,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.7423312883435583,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507333,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8186462324393359,
"acc_stderr": 0.013778693778464076,
"acc_norm": 0.8186462324393359,
"acc_norm_stderr": 0.013778693778464076
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7658959537572254,
"acc_stderr": 0.022797110278071124,
"acc_norm": 0.7658959537572254,
"acc_norm_stderr": 0.022797110278071124
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4402234636871508,
"acc_stderr": 0.01660256461504994,
"acc_norm": 0.4402234636871508,
"acc_norm_stderr": 0.01660256461504994
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.761437908496732,
"acc_stderr": 0.02440439492808787,
"acc_norm": 0.761437908496732,
"acc_norm_stderr": 0.02440439492808787
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7331189710610932,
"acc_stderr": 0.025122637608816657,
"acc_norm": 0.7331189710610932,
"acc_norm_stderr": 0.025122637608816657
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7808641975308642,
"acc_stderr": 0.023016705640262196,
"acc_norm": 0.7808641975308642,
"acc_norm_stderr": 0.023016705640262196
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5390070921985816,
"acc_stderr": 0.02973659252642444,
"acc_norm": 0.5390070921985816,
"acc_norm_stderr": 0.02973659252642444
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5032594524119948,
"acc_stderr": 0.012769964760343318,
"acc_norm": 0.5032594524119948,
"acc_norm_stderr": 0.012769964760343318
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.026799562024887667,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.026799562024887667
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7026143790849673,
"acc_stderr": 0.018492596536396955,
"acc_norm": 0.7026143790849673,
"acc_norm_stderr": 0.018492596536396955
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7795918367346939,
"acc_stderr": 0.02653704531214529,
"acc_norm": 0.7795918367346939,
"acc_norm_stderr": 0.02653704531214529
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578337,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578337
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352203,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352203
},
"harness|hendrycksTest-virology|5": {
"acc": 0.572289156626506,
"acc_stderr": 0.03851597683718533,
"acc_norm": 0.572289156626506,
"acc_norm_stderr": 0.03851597683718533
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5152998776009792,
"mc1_stderr": 0.0174953044731879,
"mc2": 0.6774426022949598,
"mc2_stderr": 0.014870145786575549
},
"harness|winogrande|5": {
"acc": 0.8421468034727704,
"acc_stderr": 0.010247165248719763
},
"harness|gsm8k|5": {
"acc": 0.6209249431387415,
"acc_stderr": 0.013363630295088361
}
}
```
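The per-task `acc` values above can also be aggregated locally. A minimal sketch computing a plain unweighted macro average over an illustrative subset of the entries shown:

```python
# Macro-average the "acc" values of a few per-task entries taken
# from the results JSON above.
scores = {
    "harness|arc:challenge|25": 0.6535836177474402,
    "harness|hellaswag|10": 0.6835291774546903,
    "harness|winogrande|5": 0.8421468034727704,
}

# Plain unweighted mean over the selected tasks.
macro_avg = sum(scores.values()) / len(scores)
```

Note that this is only an unweighted mean over three tasks; the leaderboard's own aggregation (the `"all"` block above) covers every evaluated task.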
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard/details_Badgids__Gonzo-Code-7B | ---
pretty_name: Evaluation run of Badgids/Gonzo-Code-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Badgids/Gonzo-Code-7B](https://huggingface.co/Badgids/Gonzo-Code-7B) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Badgids__Gonzo-Code-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-02T19:37:49.805412](https://huggingface.co/datasets/open-llm-leaderboard/details_Badgids__Gonzo-Code-7B/blob/main/results_2024-03-02T19-37-49.805412.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6277875645270296,\n\
\ \"acc_stderr\": 0.0325603894720076,\n \"acc_norm\": 0.6309705779726885,\n\
\ \"acc_norm_stderr\": 0.03320985289346225,\n \"mc1\": 0.3953488372093023,\n\
\ \"mc1_stderr\": 0.017115815632418194,\n \"mc2\": 0.5670183360220316,\n\
\ \"mc2_stderr\": 0.015744860563394754\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5802047781569966,\n \"acc_stderr\": 0.014422181226303026,\n\
\ \"acc_norm\": 0.6126279863481229,\n \"acc_norm_stderr\": 0.01423587248790987\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6501692889862577,\n\
\ \"acc_stderr\": 0.004759416464201141,\n \"acc_norm\": 0.8366859191396137,\n\
\ \"acc_norm_stderr\": 0.003688965231733525\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.046482319871173156,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.046482319871173156\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5703703703703704,\n\
\ \"acc_stderr\": 0.042763494943765995,\n \"acc_norm\": 0.5703703703703704,\n\
\ \"acc_norm_stderr\": 0.042763494943765995\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6710526315789473,\n \"acc_stderr\": 0.03823428969926604,\n\
\ \"acc_norm\": 0.6710526315789473,\n \"acc_norm_stderr\": 0.03823428969926604\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.028049186315695248,\n\
\ \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.028049186315695248\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n\
\ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6242774566473989,\n\
\ \"acc_stderr\": 0.036928207672648664,\n \"acc_norm\": 0.6242774566473989,\n\
\ \"acc_norm_stderr\": 0.036928207672648664\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266345,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266345\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.548936170212766,\n \"acc_stderr\": 0.03252909619613197,\n\
\ \"acc_norm\": 0.548936170212766,\n \"acc_norm_stderr\": 0.03252909619613197\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n\
\ \"acc_stderr\": 0.046854730419077895,\n \"acc_norm\": 0.45614035087719296,\n\
\ \"acc_norm_stderr\": 0.046854730419077895\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.0416180850350153,\n\
\ \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.0416180850350153\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.37566137566137564,\n \"acc_stderr\": 0.024942368931159788,\n \"\
acc_norm\": 0.37566137566137564,\n \"acc_norm_stderr\": 0.024942368931159788\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n\
\ \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n\
\ \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7419354838709677,\n \"acc_stderr\": 0.02489246917246283,\n \"\
acc_norm\": 0.7419354838709677,\n \"acc_norm_stderr\": 0.02489246917246283\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"\
acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009182,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009182\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586815,\n \"\
acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586815\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.023814477086593563,\n\
\ \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.023814477086593563\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6025641025641025,\n \"acc_stderr\": 0.024811920017903836,\n\
\ \"acc_norm\": 0.6025641025641025,\n \"acc_norm_stderr\": 0.024811920017903836\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32592592592592595,\n \"acc_stderr\": 0.028578348365473072,\n \
\ \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.028578348365473072\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886783,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886783\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.038227469376587525,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.038227469376587525\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8055045871559633,\n \"acc_stderr\": 0.01697028909045804,\n \"\
acc_norm\": 0.8055045871559633,\n \"acc_norm_stderr\": 0.01697028909045804\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.46296296296296297,\n \"acc_stderr\": 0.03400603625538271,\n \"\
acc_norm\": 0.46296296296296297,\n \"acc_norm_stderr\": 0.03400603625538271\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7843137254901961,\n \"acc_stderr\": 0.02886743144984932,\n \"\
acc_norm\": 0.7843137254901961,\n \"acc_norm_stderr\": 0.02886743144984932\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7890295358649789,\n \"acc_stderr\": 0.02655837250266192,\n \
\ \"acc_norm\": 0.7890295358649789,\n \"acc_norm_stderr\": 0.02655837250266192\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7404580152671756,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.7404580152671756,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.04236511258094633,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.04236511258094633\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n\
\ \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n\
\ \"acc_stderr\": 0.022509033937077802,\n \"acc_norm\": 0.8632478632478633,\n\
\ \"acc_norm_stderr\": 0.022509033937077802\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8122605363984674,\n\
\ \"acc_stderr\": 0.013964393769899136,\n \"acc_norm\": 0.8122605363984674,\n\
\ \"acc_norm_stderr\": 0.013964393769899136\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6965317919075145,\n \"acc_stderr\": 0.02475241196091721,\n\
\ \"acc_norm\": 0.6965317919075145,\n \"acc_norm_stderr\": 0.02475241196091721\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.37318435754189944,\n\
\ \"acc_stderr\": 0.016175692013381968,\n \"acc_norm\": 0.37318435754189944,\n\
\ \"acc_norm_stderr\": 0.016175692013381968\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6928104575163399,\n \"acc_stderr\": 0.026415601914388992,\n\
\ \"acc_norm\": 0.6928104575163399,\n \"acc_norm_stderr\": 0.026415601914388992\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n\
\ \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n\
\ \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7037037037037037,\n \"acc_stderr\": 0.02540719779889016,\n\
\ \"acc_norm\": 0.7037037037037037,\n \"acc_norm_stderr\": 0.02540719779889016\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4645390070921986,\n \"acc_stderr\": 0.02975238965742705,\n \
\ \"acc_norm\": 0.4645390070921986,\n \"acc_norm_stderr\": 0.02975238965742705\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46936114732724904,\n\
\ \"acc_stderr\": 0.012746237711716634,\n \"acc_norm\": 0.46936114732724904,\n\
\ \"acc_norm_stderr\": 0.012746237711716634\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.625,\n \"acc_stderr\": 0.029408372932278746,\n \
\ \"acc_norm\": 0.625,\n \"acc_norm_stderr\": 0.029408372932278746\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6568627450980392,\n \"acc_stderr\": 0.019206606848825365,\n \
\ \"acc_norm\": 0.6568627450980392,\n \"acc_norm_stderr\": 0.019206606848825365\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.02853556033712844,\n\
\ \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.02853556033712844\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454132,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454132\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \
\ \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n\
\ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n\
\ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.02991312723236804,\n\
\ \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.02991312723236804\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3953488372093023,\n\
\ \"mc1_stderr\": 0.017115815632418194,\n \"mc2\": 0.5670183360220316,\n\
\ \"mc2_stderr\": 0.015744860563394754\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7726913970007893,\n \"acc_stderr\": 0.011778612167091088\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5140257771038665,\n \
\ \"acc_stderr\": 0.013767064940239283\n }\n}\n```"
repo_url: https://huggingface.co/Severian/Nexus-IKM-Hermes-2-Pro-Mistral-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_02T19_37_49.805412
path:
- '**/details_harness|arc:challenge|25_2024-03-02T19-37-49.805412.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-02T19-37-49.805412.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_02T19_37_49.805412
path:
- '**/details_harness|gsm8k|5_2024-03-02T19-37-49.805412.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-02T19-37-49.805412.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_02T19_37_49.805412
path:
- '**/details_harness|hellaswag|10_2024-03-02T19-37-49.805412.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-02T19-37-49.805412.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_02T19_37_49.805412
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-02T19-37-49.805412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-02T19-37-49.805412.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-02T19-37-49.805412.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_02T19_37_49.805412
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-02T19-37-49.805412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-02T19-37-49.805412.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_02T19_37_49.805412
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-02T19-37-49.805412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-02T19-37-49.805412.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_02T19_37_49.805412
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-02T19-37-49.805412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-02T19-37-49.805412.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_02T19_37_49.805412
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-02T19-37-49.805412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-02T19-37-49.805412.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_02T19_37_49.805412
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-02T19-37-49.805412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-02T19-37-49.805412.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_02T19_37_49.805412
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-02T19-37-49.805412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-02T19-37-49.805412.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_02T19_37_49.805412
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-02T19-37-49.805412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-02T19-37-49.805412.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_02T19_37_49.805412
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-02T19-37-49.805412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-02T19-37-49.805412.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_02T19_37_49.805412
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-02T19-37-49.805412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-02T19-37-49.805412.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_02T19_37_49.805412
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-02T19-37-49.805412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-02T19-37-49.805412.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_02T19_37_49.805412
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-02T19-37-49.805412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-02T19-37-49.805412.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_02T19_37_49.805412
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-02T19-37-49.805412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-02T19-37-49.805412.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_02T19_37_49.805412
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-02T19-37-49.805412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-02T19-37-49.805412.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_02T19_37_49.805412
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-02T19-37-49.805412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-02T19-37-49.805412.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_02T19_37_49.805412
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-02T19-37-49.805412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-02T19-37-49.805412.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_02T19_37_49.805412
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-02T19-37-49.805412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-02T19-37-49.805412.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_02T19_37_49.805412
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-02T19-37-49.805412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-02T19-37-49.805412.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_02T19_37_49.805412
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-02T19-37-49.805412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-02T19-37-49.805412.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_02T19_37_49.805412
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-02T19-37-49.805412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-02T19-37-49.805412.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_02T19_37_49.805412
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-02T19-37-49.805412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-02T19-37-49.805412.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_02T19_37_49.805412
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-02T19-37-49.805412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-02T19-37-49.805412.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_02T19_37_49.805412
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-02T19-37-49.805412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-02T19-37-49.805412.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_02T19_37_49.805412
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-02T19-37-49.805412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-02T19-37-49.805412.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_02T19_37_49.805412
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-02T19-37-49.805412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-02T19-37-49.805412.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_02T19_37_49.805412
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-02T19-37-49.805412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-02T19-37-49.805412.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_02T19_37_49.805412
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-02T19-37-49.805412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-02T19-37-49.805412.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_02T19_37_49.805412
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-02T19-37-49.805412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-02T19-37-49.805412.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_02T19_37_49.805412
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-02T19-37-49.805412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-02T19-37-49.805412.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_02T19_37_49.805412
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-02T19-37-49.805412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-02T19-37-49.805412.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_02T19_37_49.805412
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-02T19-37-49.805412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-02T19-37-49.805412.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_02T19_37_49.805412
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-02T19-37-49.805412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-02T19-37-49.805412.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_02T19_37_49.805412
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-02T19-37-49.805412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-02T19-37-49.805412.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_02T19_37_49.805412
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-02T19-37-49.805412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-02T19-37-49.805412.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_02T19_37_49.805412
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-02T19-37-49.805412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-02T19-37-49.805412.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_02T19_37_49.805412
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-02T19-37-49.805412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-02T19-37-49.805412.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_02T19_37_49.805412
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-02T19-37-49.805412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-02T19-37-49.805412.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_02T19_37_49.805412
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-02T19-37-49.805412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-02T19-37-49.805412.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_02T19_37_49.805412
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-02T19-37-49.805412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-02T19-37-49.805412.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_02T19_37_49.805412
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-02T19-37-49.805412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-02T19-37-49.805412.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_02T19_37_49.805412
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-02T19-37-49.805412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-02T19-37-49.805412.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_02T19_37_49.805412
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-02T19-37-49.805412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-02T19-37-49.805412.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_02T19_37_49.805412
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-02T19-37-49.805412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-02T19-37-49.805412.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_02T19_37_49.805412
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-02T19-37-49.805412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-02T19-37-49.805412.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_02T19_37_49.805412
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-02T19-37-49.805412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-02T19-37-49.805412.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_02T19_37_49.805412
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-02T19-37-49.805412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-02T19-37-49.805412.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_02T19_37_49.805412
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-02T19-37-49.805412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-02T19-37-49.805412.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_02T19_37_49.805412
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-02T19-37-49.805412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-02T19-37-49.805412.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_02T19_37_49.805412
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-02T19-37-49.805412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-02T19-37-49.805412.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_02T19_37_49.805412
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-02T19-37-49.805412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-02T19-37-49.805412.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_02T19_37_49.805412
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-02T19-37-49.805412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-02T19-37-49.805412.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_02T19_37_49.805412
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-02T19-37-49.805412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-02T19-37-49.805412.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_02T19_37_49.805412
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-02T19-37-49.805412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-02T19-37-49.805412.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_02T19_37_49.805412
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-02T19-37-49.805412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-02T19-37-49.805412.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_02T19_37_49.805412
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-02T19-37-49.805412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-02T19-37-49.805412.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_02T19_37_49.805412
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-02T19-37-49.805412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-02T19-37-49.805412.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_02T19_37_49.805412
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-02T19-37-49.805412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-02T19-37-49.805412.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_02T19_37_49.805412
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-02T19-37-49.805412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-02T19-37-49.805412.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_02T19_37_49.805412
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-02T19-37-49.805412.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-02T19-37-49.805412.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_02T19_37_49.805412
path:
- '**/details_harness|winogrande|5_2024-03-02T19-37-49.805412.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-02T19-37-49.805412.parquet'
- config_name: results
data_files:
- split: 2024_03_02T19_37_49.805412
path:
- results_2024-03-02T19-37-49.805412.parquet
- split: latest
path:
- results_2024-03-02T19-37-49.805412.parquet
---
# Dataset Card for Evaluation run of Badgids/Gonzo-Code-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Badgids/Gonzo-Code-7B](https://huggingface.co/Badgids/Gonzo-Code-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Badgids__Gonzo-Code-7B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-03-02T19:37:49.805412](https://huggingface.co/datasets/open-llm-leaderboard/details_Badgids__Gonzo-Code-7B/blob/main/results_2024-03-02T19-37-49.805412.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6277875645270296,
"acc_stderr": 0.0325603894720076,
"acc_norm": 0.6309705779726885,
"acc_norm_stderr": 0.03320985289346225,
"mc1": 0.3953488372093023,
"mc1_stderr": 0.017115815632418194,
"mc2": 0.5670183360220316,
"mc2_stderr": 0.015744860563394754
},
"harness|arc:challenge|25": {
"acc": 0.5802047781569966,
"acc_stderr": 0.014422181226303026,
"acc_norm": 0.6126279863481229,
"acc_norm_stderr": 0.01423587248790987
},
"harness|hellaswag|10": {
"acc": 0.6501692889862577,
"acc_stderr": 0.004759416464201141,
"acc_norm": 0.8366859191396137,
"acc_norm_stderr": 0.003688965231733525
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.046482319871173156,
"acc_norm": 0.31,
"acc_norm_stderr": 0.046482319871173156
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5703703703703704,
"acc_stderr": 0.042763494943765995,
"acc_norm": 0.5703703703703704,
"acc_norm_stderr": 0.042763494943765995
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6710526315789473,
"acc_stderr": 0.03823428969926604,
"acc_norm": 0.6710526315789473,
"acc_norm_stderr": 0.03823428969926604
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.028049186315695248,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.028049186315695248
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.75,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6242774566473989,
"acc_stderr": 0.036928207672648664,
"acc_norm": 0.6242774566473989,
"acc_norm_stderr": 0.036928207672648664
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266345,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266345
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.548936170212766,
"acc_stderr": 0.03252909619613197,
"acc_norm": 0.548936170212766,
"acc_norm_stderr": 0.03252909619613197
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.046854730419077895,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.046854730419077895
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.0416180850350153,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.0416180850350153
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.37566137566137564,
"acc_stderr": 0.024942368931159788,
"acc_norm": 0.37566137566137564,
"acc_norm_stderr": 0.024942368931159788
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.04360314860077459,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.04360314860077459
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7419354838709677,
"acc_stderr": 0.02489246917246283,
"acc_norm": 0.7419354838709677,
"acc_norm_stderr": 0.02489246917246283
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009182,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009182
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586815,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586815
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.023814477086593563,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.023814477086593563
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6025641025641025,
"acc_stderr": 0.024811920017903836,
"acc_norm": 0.6025641025641025,
"acc_norm_stderr": 0.024811920017903836
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.028578348365473072,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.028578348365473072
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.030388353551886783,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.030388353551886783
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.038227469376587525,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.038227469376587525
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8055045871559633,
"acc_stderr": 0.01697028909045804,
"acc_norm": 0.8055045871559633,
"acc_norm_stderr": 0.01697028909045804
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.46296296296296297,
"acc_stderr": 0.03400603625538271,
"acc_norm": 0.46296296296296297,
"acc_norm_stderr": 0.03400603625538271
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7843137254901961,
"acc_stderr": 0.02886743144984932,
"acc_norm": 0.7843137254901961,
"acc_norm_stderr": 0.02886743144984932
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7890295358649789,
"acc_stderr": 0.02655837250266192,
"acc_norm": 0.7890295358649789,
"acc_norm_stderr": 0.02655837250266192
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7404580152671756,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.7404580152671756,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.04236511258094633,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.04236511258094633
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742178,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742178
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.022509033937077802,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.022509033937077802
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8122605363984674,
"acc_stderr": 0.013964393769899136,
"acc_norm": 0.8122605363984674,
"acc_norm_stderr": 0.013964393769899136
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6965317919075145,
"acc_stderr": 0.02475241196091721,
"acc_norm": 0.6965317919075145,
"acc_norm_stderr": 0.02475241196091721
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.37318435754189944,
"acc_stderr": 0.016175692013381968,
"acc_norm": 0.37318435754189944,
"acc_norm_stderr": 0.016175692013381968
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6928104575163399,
"acc_stderr": 0.026415601914388992,
"acc_norm": 0.6928104575163399,
"acc_norm_stderr": 0.026415601914388992
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.02540719779889016,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.02540719779889016
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4645390070921986,
"acc_stderr": 0.02975238965742705,
"acc_norm": 0.4645390070921986,
"acc_norm_stderr": 0.02975238965742705
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46936114732724904,
"acc_stderr": 0.012746237711716634,
"acc_norm": 0.46936114732724904,
"acc_norm_stderr": 0.012746237711716634
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.625,
"acc_stderr": 0.029408372932278746,
"acc_norm": 0.625,
"acc_norm_stderr": 0.029408372932278746
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6568627450980392,
"acc_stderr": 0.019206606848825365,
"acc_norm": 0.6568627450980392,
"acc_norm_stderr": 0.019206606848825365
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.02853556033712844,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.02853556033712844
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454132,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454132
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.02991312723236804,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.02991312723236804
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3953488372093023,
"mc1_stderr": 0.017115815632418194,
"mc2": 0.5670183360220316,
"mc2_stderr": 0.015744860563394754
},
"harness|winogrande|5": {
"acc": 0.7726913970007893,
"acc_stderr": 0.011778612167091088
},
"harness|gsm8k|5": {
"acc": 0.5140257771038665,
"acc_stderr": 0.013767064940239283
}
}
```
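The per-task MMLU scores in the block above all share the `harness|hendrycksTest-` key prefix, so an unweighted average can be pulled straight out of the dict. A minimal sketch using two entries copied from the block above (the full dict has 57 MMLU tasks):

```python
# The "latest results" block is a plain Python dict keyed by task name.
# Two MMLU entries and one non-MMLU entry copied from the block above.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.31},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.5703703703703704},
    "harness|winogrande|5": {"acc": 0.7726913970007893},
}

# Keep only the MMLU (hendrycksTest) tasks and average their accuracies.
mmlu_accs = [
    scores["acc"]
    for task, scores in results.items()
    if task.startswith("harness|hendrycksTest-")
]
print(sum(mmlu_accs) / len(mmlu_accs))
```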
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
sanchit-gandhi/concatenated-train-set-label-length-256-conditioned | ---
dataset_info:
config_name: train
features:
- name: id
dtype: string
- name: text
dtype: string
- name: input_features
dtype: image
- name: condition_on_prev
sequence: int64
- name: whisper_transcript
sequence: int64
splits:
- name: train
num_bytes: 4009681656693.0
num_examples: 2607550
download_size: 2515115685452
dataset_size: 4009681656693.0
configs:
- config_name: train
data_files:
- split: train
path: train/train-*
---
|
gguichard/wsd_myriade_synth_data_gpt4turbo_v2 | ---
dataset_info:
features:
- name: tokens
sequence: string
- name: wn_sens
sequence: int64
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 2222567
num_examples: 3391
download_size: 473896
dataset_size: 2222567
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "wsd_myriade_synth_data_gpt4turbo_v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
alexandrainst/da-wit | ---
pretty_name: Danish WIT
language:
- da
license:
- cc-by-sa-4.0
size_categories:
- 100K<n<1M
source_datasets:
- wikimedia/wit_base
task_categories:
- image-to-text
- zero-shot-image-classification
- feature-extraction
task_ids:
- image-captioning
---
# Dataset Card for Danish WIT
## Dataset Description
- **Repository:** <https://gist.github.com/saattrupdan/bb6c9c52d9f4b35258db2b2456d31224>
- **Point of Contact:** [Dan Saattrup Nielsen](mailto:dan.nielsen@alexandra.dk)
- **Size of downloaded dataset files:** 7.5 GB
- **Size of the generated dataset:** 7.8 GB
- **Total amount of disk used:** 15.3 GB
### Dataset Summary
Google presented the Wikipedia Image Text (WIT) dataset in [July
2021](https://dl.acm.org/doi/abs/10.1145/3404835.3463257), a dataset which contains
scraped images from Wikipedia along with their descriptions. WikiMedia released
WIT-Base in [September
2021](https://techblog.wikimedia.org/2021/09/09/the-wikipedia-image-caption-matching-challenge-and-a-huge-release-of-image-data-for-research/),
a modified version of WIT from which they removed images with empty
"reference descriptions", images where a person's face covers more than 10% of
the image surface, and inappropriate images that are candidates for deletion.
This dataset is the Danish portion of the WIT-Base dataset, consisting of
roughly 160,000 images with associated Danish descriptions. We release the dataset
under the [CC BY-SA 4.0 license](https://creativecommons.org/licenses/by-sa/4.0/), in
accordance with WIT-Base's [identical
license](https://huggingface.co/datasets/wikimedia/wit_base#licensing-information).
### Supported Tasks and Leaderboards
Training machine learning models for caption generation, zero-shot image classification
and text-image search are the intended tasks for this dataset. No leaderboard is active
at this point.
### Languages
The dataset is available in Danish (`da`).
## Dataset Structure
### Data Instances
- **Size of downloaded dataset files:** 7.5 GB
- **Size of the generated dataset:** 7.8 GB
- **Total amount of disk used:** 15.3 GB
An example from the `train` split looks as follows.
```
{
"image": [PIL.JpegImagePlugin.JpegImageFile image mode=RGB size=300x409 at 0x7FE4384E2190],
"image_url": "https://upload.wikimedia.org/wikipedia/commons/4/45/Bispen_-_inside.jpg",
"embedding": [2.8568285, 2.9562542, 0.33794892, 8.753725, ...],
"metadata_url": "http://commons.wikimedia.org/wiki/File:Bispen_-_inside.jpg",
"original_height": 3161,
"original_width": 2316,
"mime_type": "image/jpeg",
"caption_attribution_description": "Kulturhuset Bispen set indefra. Biblioteket er til venstre",
"page_url": "https://da.wikipedia.org/wiki/Bispen",
"attribution_passes_lang_id": True,
"caption_alt_text_description": None,
"caption_reference_description": "Bispen set indefra fra 1. sal, hvor ....",
"caption_title_and_reference_description": "Bispen [SEP] Bispen set indefra ...",
"context_page_description": "Bispen er navnet på det offentlige kulturhus i ...",
"context_section_description": "Bispen er navnet på det offentlige kulturhus i ...",
"hierarchical_section_title": "Bispen",
"is_main_image": True,
"page_changed_recently": True,
"page_title": "Bispen",
"section_title": None
}
```
### Data Fields
The data fields are the same among all splits.
- `image`: an `Image` feature.
- `image_url`: a `str` feature.
- `embedding`: a `list` feature.
- `metadata_url`: a `str` feature.
- `original_height`: an `int` or `NaN` feature.
- `original_width`: an `int` or `NaN` feature.
- `mime_type`: a `str` or `None` feature.
- `caption_attribution_description`: a `str` or `None` feature.
- `page_url`: a `str` feature.
- `attribution_passes_lang_id`: a `bool` or `None` feature.
- `caption_alt_text_description`: a `str` or `None` feature.
- `caption_reference_description`: a `str` or `None` feature.
- `caption_title_and_reference_description`: a `str` or `None` feature.
- `context_page_description`: a `str` or `None` feature.
- `context_section_description`: a `str` or `None` feature.
- `hierarchical_section_title`: a `str` feature.
- `is_main_image`: a `bool` or `None` feature.
- `page_changed_recently`: a `bool` or `None` feature.
- `page_title`: a `str` feature.
- `section_title`: a `str` or `None` feature.
### Data Splits
Roughly 2.60% of the WIT-Base dataset comes from the Danish Wikipedia. We have split
the resulting 168,740 samples into a training set, validation set and testing set of
the following sizes:
| split | samples |
|---------|--------:|
| train | 167,460 |
| val | 256 |
| test | 1,024 |
## Dataset Creation
### Curation Rationale
It is quite cumbersome to extract the Danish portion of the WIT-Base dataset,
especially as the dataset takes up 333 GB of disk space, so the curation of Danish-WIT
is purely to make it easier to work with the Danish portion of it.
### Source Data
The original data was collected from WikiMedia's
[WIT-Base](https://huggingface.co/datasets/wikimedia/wit_base) dataset, which in turn
comes from Google's [WIT](https://huggingface.co/datasets/google/wit) dataset.
## Additional Information
### Dataset Curators
[Dan Saattrup Nielsen](https://saattrupdan.github.io/) from the [The Alexandra
Institute](https://alexandra.dk/) curated this dataset.
### Licensing Information
The dataset is licensed under the [CC BY-SA 4.0
license](https://creativecommons.org/licenses/by-sa/4.0/).
|
harrywang/crypto-coven | ---
license: mit
---
This dataset contains information about the 9,761 witches from the Crypto Coven NFT project (https://www.cryptocoven.xyz/), collected using the OpenSea API.
The folder 'witch_images' includes the images of each witch in three different sizes.
The columns of `witches.csv` are briefly described below:
- `id`: the id of the witch
- `num_sales`: number of sales in the past (up to 4/21/2022, the day I collected the data)
- `name`: the name of the witch
- `description`: the description of the witch
- `external_link`: the link to the official page for the witch
- `permalink`: the OpenSea link for the witch
- `token_metadata`: the metadata JSON file about the witch
- `token_id`: the token_id of the NFT
- `owner.user.username`: the user name of the current owner
- `owner.address`: the wallet address of the current owner
- `last_sale.total_price`: the price of the last sale in gwei. Note that the unit here is gwei (giga-wei): 1 ether = 10^9 gwei (and 1 ether = 10^18 wei)
- `last_sale.payment_token.usd_price`: the USD price of 1 ether (ETH) for the last sale
- `last_sale.transaction.timestamp`: the timestamp of the last sale
- `properties`: there are 32 properties of each witch covering the different design elements of each witch, such as Skin Tone, Eyebrows, Body Shape, etc.
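Since `last_sale.total_price` is denominated in gwei, converting a sale to ETH and USD is straightforward. A minimal sketch, using a made-up sale price and exchange rate for illustration:

```python
GWEI_PER_ETH = 10**9  # 1 ether = 10^9 gwei

def sale_in_usd(total_price_gwei: float, usd_per_eth: float) -> float:
    """Convert a last_sale.total_price (in gwei) to USD,
    given last_sale.payment_token.usd_price (USD per 1 ETH)."""
    eth = total_price_gwei / GWEI_PER_ETH
    return eth * usd_per_eth

# Hypothetical sale: 2.5 ETH expressed in gwei, at $3,000 per ETH
print(sale_in_usd(2_500_000_000, 3_000.0))  # → 7500.0
```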
`witches_full.csv` is the full data provided by the OpenSea API, such as https://api.opensea.io/api/v1/asset/0x5180db8f5c931aae63c74266b211f580155ecac8/50. I simply flattened the JSON returned by the API. |
GugaKunkel/Breaking_Bad_Scenes_LLM | ---
license: mit
---
|
SpellcraftAI/wordnet | ---
license: mit
---
This dataset contains the embeddings and pairwise cosine similarities for ~76k English nouns, verbs, and adjectives from [Princeton's WordNet database](https://wordnet.princeton.edu/). |
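Given such an embedding matrix, an all-pairs cosine-similarity table can be reproduced with NumPy. A minimal sketch, using a tiny toy matrix in place of the real ~76k embeddings:

```python
import numpy as np

def cosine_similarity_matrix(embeddings: np.ndarray) -> np.ndarray:
    """All-pairs cosine similarity for an (n_words, dim) embedding matrix."""
    norms = np.linalg.norm(embeddings, axis=1, keepdims=True)
    unit = embeddings / norms  # normalize each row to unit length
    return unit @ unit.T       # dot products of unit vectors = cosine similarities

# Toy stand-in for the real embedding matrix
emb = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
sim = cosine_similarity_matrix(emb)
print(np.round(sim, 3))
```

Each diagonal entry is 1.0 (every word is identical to itself), and off-diagonal entries range over [-1, 1].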
eliebak/test-phi2-gen-dataset | ---
task_categories:
- question-answering
- text-generation
multilinguality:
- monolingual
dataset_info:
features:
- name: prompt_id
dtype: int64
- name: system_instruction
dtype: string
- name: question
dtype: string
- name: output_instruction
dtype: string
- name: answer
dtype: string
- name: score
sequence: float32
splits:
- name: human
num_bytes: 4053
num_examples: 3
- name: ai
num_bytes: 2470
num_examples: 3
- name: hub
num_bytes: 5512
num_examples: 4
download_size: 38291
dataset_size: 12035
configs:
- config_name: default
data_files:
- split: human
path: data/human-*
- split: ai
path: data/ai-*
- split: hub
path: data/hub-*
---
# Small dataset generated by phi-2 for testing purposes
This dataset was generated by phi-2 using three different prompt-generation methods, each targeting a specific task:
* Human-generated: assessing code
* AI-generated: assessing step-by-step math reasoning
* From the 🤗 Hub: assessing helpfulness
**Authors:** Elie Bakouch
The goal of this dataset is to demonstrate that with a small language model (even if not aligned) and a robust reward model, we can generate a reasonably good dataset for fine-tuning on specific tasks.
**This dataset is only a proof of concept and is not intended for use in production.**
We used [phi-2](https://huggingface.co/microsoft/phi-2) as the small language model and [Deberta](https://huggingface.co/OpenAssistant/reward-model-deberta-v3-large-v2) as the reward model.
## Generation process
- The `'human'` part of the dataset is written by the author.
- The `'ai'` part is generated by [Mixtral-8x7B-Instruct-v0.1](https://huggingface.co/mistralai/Mixtral-8x7B-Instruct-v0.1) using the following prompt:
```
You are a ML engineer with 20 years of experiences, expert in alignment of large language model. You want to construct a robust dataset for doing SFT or RLHF to make LLM's good at math. Generate 5 prompt to give to your model. Don't generate the answer, only the math question.
Here is an example of what we want the prompt to look like :
"Q: Roger has 5 tennis balls. He buys 2 more cans of
tennis balls. Each can has 3 tennis balls. How many
tennis balls does he have now?
A: Roger started with 5 balls. 2 cans of 3 tennis balls
each is 6 tennis balls. 5 + 6 = 11. The answer is 11.
Q: The cafeteria had 23 apples. If they used 20 to
make lunch and bought 6 more, how many apples
do they have?"
```
- The `'hub'` part is sourced from the [hhh_alignment](https://huggingface.co/datasets/HuggingFaceH4/hhh_alignment) dataset.
# How to use
```python
from datasets import load_dataset
dataset = load_dataset("eliebak/test-phi2-gen-dataset")
```
|
trondizzy/Tatoeba_v2022_03_03 | ---
license: cc
task_categories:
- translation
language:
- uk
- en
size_categories:
- 100K<n<1M
--- |
SyncGlob/chatgpt_prompts | ---
license: cc
tags:
- Chatgpt
- gpt
- prompts
--- |
AdapterOcean/med_alpaca_standardized_cluster_26_alpaca | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 19175506
num_examples: 11932
download_size: 10005227
dataset_size: 19175506
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "med_alpaca_standardized_cluster_26_alpaca"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jilp00/YouToks-Instruct-Quantum-Physics-II | ---
dataset_info:
features:
- name: text
dtype: string
- name: token_count
dtype: int64
- name: response
dtype: string
- name: system_prompt
dtype: string
- name: question
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 2003113
num_examples: 1042
download_size: 981109
dataset_size: 2003113
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
azevedopedroc/canaliasia | ---
license: openrail
---
|
SUSTech/valley_instruct_65k | ---
dataset_info:
features:
- name: id
dtype: string
- name: v_id
dtype: string
- name: video
dtype: string
- name: source
dtype: string
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: video_url
dtype: string
splits:
- name: train
num_bytes: 85450295
num_examples: 64690
download_size: 34934388
dataset_size: 85450295
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_pansophic__new_model_test | ---
pretty_name: Evaluation run of pansophic/new_model_test
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [pansophic/new_model_test](https://huggingface.co/pansophic/new_model_test) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_pansophic__new_model_test\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-29T18:43:28.582907](https://huggingface.co/datasets/open-llm-leaderboard/details_pansophic__new_model_test/blob/main/results_2024-02-29T18-43-28.582907.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4639971505070735,\n\
\ \"acc_stderr\": 0.034656622817217424,\n \"acc_norm\": 0.4660219805936662,\n\
\ \"acc_norm_stderr\": 0.03536970907083775,\n \"mc1\": 0.31334149326805383,\n\
\ \"mc1_stderr\": 0.016238065069059615,\n \"mc2\": 0.5124518749351841,\n\
\ \"mc2_stderr\": 0.014906275227740079\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.49402730375426623,\n \"acc_stderr\": 0.014610348300255795,\n\
\ \"acc_norm\": 0.5255972696245734,\n \"acc_norm_stderr\": 0.014592230885298962\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5337582154949213,\n\
\ \"acc_stderr\": 0.004978395540514382,\n \"acc_norm\": 0.7365066719776937,\n\
\ \"acc_norm_stderr\": 0.004396273173717462\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4148148148148148,\n\
\ \"acc_stderr\": 0.042561937679014075,\n \"acc_norm\": 0.4148148148148148,\n\
\ \"acc_norm_stderr\": 0.042561937679014075\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.45394736842105265,\n \"acc_stderr\": 0.04051646342874142,\n\
\ \"acc_norm\": 0.45394736842105265,\n \"acc_norm_stderr\": 0.04051646342874142\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.49,\n\
\ \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.49,\n \
\ \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.4981132075471698,\n \"acc_stderr\": 0.030772653642075664,\n\
\ \"acc_norm\": 0.4981132075471698,\n \"acc_norm_stderr\": 0.030772653642075664\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5069444444444444,\n\
\ \"acc_stderr\": 0.04180806750294938,\n \"acc_norm\": 0.5069444444444444,\n\
\ \"acc_norm_stderr\": 0.04180806750294938\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n\
\ \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.41040462427745666,\n\
\ \"acc_stderr\": 0.03750757044895537,\n \"acc_norm\": 0.41040462427745666,\n\
\ \"acc_norm_stderr\": 0.03750757044895537\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.04220773659171452,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.04220773659171452\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n\
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.44680851063829785,\n \"acc_stderr\": 0.0325005368436584,\n\
\ \"acc_norm\": 0.44680851063829785,\n \"acc_norm_stderr\": 0.0325005368436584\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3157894736842105,\n\
\ \"acc_stderr\": 0.04372748290278007,\n \"acc_norm\": 0.3157894736842105,\n\
\ \"acc_norm_stderr\": 0.04372748290278007\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5103448275862069,\n \"acc_stderr\": 0.04165774775728763,\n\
\ \"acc_norm\": 0.5103448275862069,\n \"acc_norm_stderr\": 0.04165774775728763\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.328042328042328,\n \"acc_stderr\": 0.02418049716437689,\n \"acc_norm\"\
: 0.328042328042328,\n \"acc_norm_stderr\": 0.02418049716437689\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n\
\ \"acc_stderr\": 0.04325506042017086,\n \"acc_norm\": 0.373015873015873,\n\
\ \"acc_norm_stderr\": 0.04325506042017086\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952365,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952365\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.5258064516129032,\n \"acc_stderr\": 0.02840609505765332,\n \"\
acc_norm\": 0.5258064516129032,\n \"acc_norm_stderr\": 0.02840609505765332\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.458128078817734,\n \"acc_stderr\": 0.03505630140785741,\n \"acc_norm\"\
: 0.458128078817734,\n \"acc_norm_stderr\": 0.03505630140785741\n },\n\
\ \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\"\
: 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"\
acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6181818181818182,\n \"acc_stderr\": 0.03793713171165635,\n\
\ \"acc_norm\": 0.6181818181818182,\n \"acc_norm_stderr\": 0.03793713171165635\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5757575757575758,\n \"acc_stderr\": 0.035212249088415845,\n \"\
acc_norm\": 0.5757575757575758,\n \"acc_norm_stderr\": 0.035212249088415845\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6269430051813472,\n \"acc_stderr\": 0.03490205592048574,\n\
\ \"acc_norm\": 0.6269430051813472,\n \"acc_norm_stderr\": 0.03490205592048574\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.40512820512820513,\n \"acc_stderr\": 0.024890471769938145,\n\
\ \"acc_norm\": 0.40512820512820513,\n \"acc_norm_stderr\": 0.024890471769938145\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2962962962962963,\n \"acc_stderr\": 0.02784081149587193,\n \
\ \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.02784081149587193\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.03214536859788639,\n\
\ \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.03214536859788639\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.304635761589404,\n \"acc_stderr\": 0.03757949922943342,\n \"acc_norm\"\
: 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943342\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.6128440366972477,\n\
\ \"acc_stderr\": 0.02088423199264345,\n \"acc_norm\": 0.6128440366972477,\n\
\ \"acc_norm_stderr\": 0.02088423199264345\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.30092592592592593,\n \"acc_stderr\": 0.031280390843298804,\n\
\ \"acc_norm\": 0.30092592592592593,\n \"acc_norm_stderr\": 0.031280390843298804\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.553921568627451,\n \"acc_stderr\": 0.03488845451304974,\n \"acc_norm\"\
: 0.553921568627451,\n \"acc_norm_stderr\": 0.03488845451304974\n },\n\
\ \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\":\
\ 0.6118143459915611,\n \"acc_stderr\": 0.031722950043323296,\n \"\
acc_norm\": 0.6118143459915611,\n \"acc_norm_stderr\": 0.031722950043323296\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5067264573991032,\n\
\ \"acc_stderr\": 0.03355476596234353,\n \"acc_norm\": 0.5067264573991032,\n\
\ \"acc_norm_stderr\": 0.03355476596234353\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.4122137404580153,\n \"acc_stderr\": 0.04317171194870255,\n\
\ \"acc_norm\": 0.4122137404580153,\n \"acc_norm_stderr\": 0.04317171194870255\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5867768595041323,\n \"acc_stderr\": 0.04495087843548408,\n \"\
acc_norm\": 0.5867768595041323,\n \"acc_norm_stderr\": 0.04495087843548408\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5462962962962963,\n\
\ \"acc_stderr\": 0.04812917324536823,\n \"acc_norm\": 0.5462962962962963,\n\
\ \"acc_norm_stderr\": 0.04812917324536823\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5214723926380368,\n \"acc_stderr\": 0.0392474687675113,\n\
\ \"acc_norm\": 0.5214723926380368,\n \"acc_norm_stderr\": 0.0392474687675113\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.39285714285714285,\n\
\ \"acc_stderr\": 0.04635550135609976,\n \"acc_norm\": 0.39285714285714285,\n\
\ \"acc_norm_stderr\": 0.04635550135609976\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5436893203883495,\n \"acc_stderr\": 0.049318019942204146,\n\
\ \"acc_norm\": 0.5436893203883495,\n \"acc_norm_stderr\": 0.049318019942204146\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7350427350427351,\n\
\ \"acc_stderr\": 0.028911208802749472,\n \"acc_norm\": 0.7350427350427351,\n\
\ \"acc_norm_stderr\": 0.028911208802749472\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6015325670498084,\n\
\ \"acc_stderr\": 0.017507438602777415,\n \"acc_norm\": 0.6015325670498084,\n\
\ \"acc_norm_stderr\": 0.017507438602777415\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.4508670520231214,\n \"acc_stderr\": 0.02678881193156276,\n\
\ \"acc_norm\": 0.4508670520231214,\n \"acc_norm_stderr\": 0.02678881193156276\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24022346368715083,\n\
\ \"acc_stderr\": 0.014288343803925308,\n \"acc_norm\": 0.24022346368715083,\n\
\ \"acc_norm_stderr\": 0.014288343803925308\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.45098039215686275,\n \"acc_stderr\": 0.028491993586171563,\n\
\ \"acc_norm\": 0.45098039215686275,\n \"acc_norm_stderr\": 0.028491993586171563\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.48231511254019294,\n\
\ \"acc_stderr\": 0.02838032284907713,\n \"acc_norm\": 0.48231511254019294,\n\
\ \"acc_norm_stderr\": 0.02838032284907713\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.49691358024691357,\n \"acc_stderr\": 0.027820214158594377,\n\
\ \"acc_norm\": 0.49691358024691357,\n \"acc_norm_stderr\": 0.027820214158594377\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.33687943262411346,\n \"acc_stderr\": 0.02819553487396673,\n \
\ \"acc_norm\": 0.33687943262411346,\n \"acc_norm_stderr\": 0.02819553487396673\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3539765319426336,\n\
\ \"acc_stderr\": 0.012213504731731644,\n \"acc_norm\": 0.3539765319426336,\n\
\ \"acc_norm_stderr\": 0.012213504731731644\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.3602941176470588,\n \"acc_stderr\": 0.029163128570670733,\n\
\ \"acc_norm\": 0.3602941176470588,\n \"acc_norm_stderr\": 0.029163128570670733\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4444444444444444,\n \"acc_stderr\": 0.020102583895887184,\n \
\ \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.020102583895887184\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5545454545454546,\n\
\ \"acc_stderr\": 0.047605488214603246,\n \"acc_norm\": 0.5545454545454546,\n\
\ \"acc_norm_stderr\": 0.047605488214603246\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.4489795918367347,\n \"acc_stderr\": 0.03184213866687579,\n\
\ \"acc_norm\": 0.4489795918367347,\n \"acc_norm_stderr\": 0.03184213866687579\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6218905472636815,\n\
\ \"acc_stderr\": 0.034288678487786564,\n \"acc_norm\": 0.6218905472636815,\n\
\ \"acc_norm_stderr\": 0.034288678487786564\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4036144578313253,\n\
\ \"acc_stderr\": 0.038194861407583984,\n \"acc_norm\": 0.4036144578313253,\n\
\ \"acc_norm_stderr\": 0.038194861407583984\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6081871345029239,\n \"acc_stderr\": 0.037439798259263996,\n\
\ \"acc_norm\": 0.6081871345029239,\n \"acc_norm_stderr\": 0.037439798259263996\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.31334149326805383,\n\
\ \"mc1_stderr\": 0.016238065069059615,\n \"mc2\": 0.5124518749351841,\n\
\ \"mc2_stderr\": 0.014906275227740079\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6637726913970008,\n \"acc_stderr\": 0.013277286593993454\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.37907505686125853,\n \
\ \"acc_stderr\": 0.013363630295088351\n }\n}\n```"
repo_url: https://huggingface.co/pansophic/new_model_test
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_29T18_43_28.582907
path:
- '**/details_harness|arc:challenge|25_2024-02-29T18-43-28.582907.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-29T18-43-28.582907.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_29T18_43_28.582907
path:
- '**/details_harness|gsm8k|5_2024-02-29T18-43-28.582907.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-29T18-43-28.582907.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_29T18_43_28.582907
path:
- '**/details_harness|hellaswag|10_2024-02-29T18-43-28.582907.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-29T18-43-28.582907.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_29T18_43_28.582907
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-29T18-43-28.582907.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-29T18-43-28.582907.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-29T18-43-28.582907.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_29T18_43_28.582907
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-29T18-43-28.582907.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-29T18-43-28.582907.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_29T18_43_28.582907
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-29T18-43-28.582907.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-29T18-43-28.582907.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_29T18_43_28.582907
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-29T18-43-28.582907.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-29T18-43-28.582907.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_29T18_43_28.582907
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-29T18-43-28.582907.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-29T18-43-28.582907.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_29T18_43_28.582907
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-29T18-43-28.582907.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-29T18-43-28.582907.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_29T18_43_28.582907
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-29T18-43-28.582907.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-29T18-43-28.582907.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_29T18_43_28.582907
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-29T18-43-28.582907.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-29T18-43-28.582907.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_29T18_43_28.582907
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-29T18-43-28.582907.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-29T18-43-28.582907.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_29T18_43_28.582907
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-29T18-43-28.582907.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-29T18-43-28.582907.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_29T18_43_28.582907
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-29T18-43-28.582907.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-29T18-43-28.582907.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_29T18_43_28.582907
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-29T18-43-28.582907.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-29T18-43-28.582907.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_29T18_43_28.582907
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-29T18-43-28.582907.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-29T18-43-28.582907.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_29T18_43_28.582907
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-29T18-43-28.582907.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-29T18-43-28.582907.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_29T18_43_28.582907
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-29T18-43-28.582907.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-29T18-43-28.582907.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_29T18_43_28.582907
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-29T18-43-28.582907.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-29T18-43-28.582907.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_29T18_43_28.582907
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-29T18-43-28.582907.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-29T18-43-28.582907.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_29T18_43_28.582907
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-29T18-43-28.582907.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-29T18-43-28.582907.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_29T18_43_28.582907
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-29T18-43-28.582907.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-29T18-43-28.582907.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_29T18_43_28.582907
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-29T18-43-28.582907.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-29T18-43-28.582907.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_29T18_43_28.582907
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-29T18-43-28.582907.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-29T18-43-28.582907.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_29T18_43_28.582907
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-29T18-43-28.582907.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-29T18-43-28.582907.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_29T18_43_28.582907
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-29T18-43-28.582907.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-29T18-43-28.582907.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_29T18_43_28.582907
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-29T18-43-28.582907.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-29T18-43-28.582907.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_29T18_43_28.582907
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-29T18-43-28.582907.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-29T18-43-28.582907.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_29T18_43_28.582907
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-29T18-43-28.582907.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-29T18-43-28.582907.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_29T18_43_28.582907
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-29T18-43-28.582907.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-29T18-43-28.582907.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_29T18_43_28.582907
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-29T18-43-28.582907.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-29T18-43-28.582907.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_29T18_43_28.582907
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-29T18-43-28.582907.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-29T18-43-28.582907.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_29T18_43_28.582907
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-29T18-43-28.582907.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-29T18-43-28.582907.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_29T18_43_28.582907
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-29T18-43-28.582907.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-29T18-43-28.582907.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_29T18_43_28.582907
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-29T18-43-28.582907.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-29T18-43-28.582907.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_29T18_43_28.582907
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-29T18-43-28.582907.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-29T18-43-28.582907.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_29T18_43_28.582907
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-29T18-43-28.582907.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-29T18-43-28.582907.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_29T18_43_28.582907
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-29T18-43-28.582907.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-29T18-43-28.582907.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_29T18_43_28.582907
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-29T18-43-28.582907.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-29T18-43-28.582907.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_29T18_43_28.582907
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-29T18-43-28.582907.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-29T18-43-28.582907.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_29T18_43_28.582907
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-29T18-43-28.582907.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-29T18-43-28.582907.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_29T18_43_28.582907
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-29T18-43-28.582907.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-29T18-43-28.582907.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_29T18_43_28.582907
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-29T18-43-28.582907.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-29T18-43-28.582907.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_29T18_43_28.582907
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-29T18-43-28.582907.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-29T18-43-28.582907.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_29T18_43_28.582907
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-29T18-43-28.582907.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-29T18-43-28.582907.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_29T18_43_28.582907
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-29T18-43-28.582907.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-29T18-43-28.582907.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_29T18_43_28.582907
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-29T18-43-28.582907.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-29T18-43-28.582907.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_29T18_43_28.582907
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-29T18-43-28.582907.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-29T18-43-28.582907.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_29T18_43_28.582907
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-29T18-43-28.582907.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-29T18-43-28.582907.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_29T18_43_28.582907
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-29T18-43-28.582907.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-29T18-43-28.582907.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_29T18_43_28.582907
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-29T18-43-28.582907.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-29T18-43-28.582907.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_29T18_43_28.582907
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-29T18-43-28.582907.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-29T18-43-28.582907.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_29T18_43_28.582907
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-29T18-43-28.582907.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-29T18-43-28.582907.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_29T18_43_28.582907
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-29T18-43-28.582907.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-29T18-43-28.582907.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_29T18_43_28.582907
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-29T18-43-28.582907.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-29T18-43-28.582907.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_29T18_43_28.582907
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-29T18-43-28.582907.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-29T18-43-28.582907.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_29T18_43_28.582907
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-29T18-43-28.582907.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-29T18-43-28.582907.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_29T18_43_28.582907
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-29T18-43-28.582907.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-29T18-43-28.582907.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_29T18_43_28.582907
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-29T18-43-28.582907.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-29T18-43-28.582907.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_29T18_43_28.582907
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-29T18-43-28.582907.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-29T18-43-28.582907.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_29T18_43_28.582907
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-29T18-43-28.582907.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-29T18-43-28.582907.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_29T18_43_28.582907
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-29T18-43-28.582907.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-29T18-43-28.582907.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_29T18_43_28.582907
path:
- '**/details_harness|winogrande|5_2024-02-29T18-43-28.582907.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-29T18-43-28.582907.parquet'
- config_name: results
data_files:
- split: 2024_02_29T18_43_28.582907
path:
- results_2024-02-29T18-43-28.582907.parquet
- split: latest
path:
- results_2024-02-29T18-43-28.582907.parquet
---
# Dataset Card for Evaluation run of pansophic/new_model_test
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [pansophic/new_model_test](https://huggingface.co/pansophic/new_model_test) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_pansophic__new_model_test",
"harness_winogrande_5",
split="train")
```
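As noted above, each run's split is named after the run timestamp, with `-` and `:` replaced by `_` (e.g. `2024_02_29T18_43_28.582907`). If you need to sort or compare runs programmatically, a small helper like the sketch below (the function name is illustrative, not part of any API) can convert a split name back into a `datetime`:

```python
from datetime import datetime

def parse_split_timestamp(split_name: str) -> datetime:
    """Parse a run split name such as '2024_02_29T18_43_28.582907'
    into a datetime (underscores stand in for '-' and ':')."""
    date_part, time_part = split_name.split("T")
    iso = date_part.replace("_", "-") + "T" + time_part.replace("_", ":")
    return datetime.fromisoformat(iso)

print(parse_split_timestamp("2024_02_29T18_43_28.582907"))
```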
## Latest results
These are the [latest results from run 2024-02-29T18:43:28.582907](https://huggingface.co/datasets/open-llm-leaderboard/details_pansophic__new_model_test/blob/main/results_2024-02-29T18-43-28.582907.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```json
{
"all": {
"acc": 0.4639971505070735,
"acc_stderr": 0.034656622817217424,
"acc_norm": 0.4660219805936662,
"acc_norm_stderr": 0.03536970907083775,
"mc1": 0.31334149326805383,
"mc1_stderr": 0.016238065069059615,
"mc2": 0.5124518749351841,
"mc2_stderr": 0.014906275227740079
},
"harness|arc:challenge|25": {
"acc": 0.49402730375426623,
"acc_stderr": 0.014610348300255795,
"acc_norm": 0.5255972696245734,
"acc_norm_stderr": 0.014592230885298962
},
"harness|hellaswag|10": {
"acc": 0.5337582154949213,
"acc_stderr": 0.004978395540514382,
"acc_norm": 0.7365066719776937,
"acc_norm_stderr": 0.004396273173717462
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4148148148148148,
"acc_stderr": 0.042561937679014075,
"acc_norm": 0.4148148148148148,
"acc_norm_stderr": 0.042561937679014075
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.45394736842105265,
"acc_stderr": 0.04051646342874142,
"acc_norm": 0.45394736842105265,
"acc_norm_stderr": 0.04051646342874142
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4981132075471698,
"acc_stderr": 0.030772653642075664,
"acc_norm": 0.4981132075471698,
"acc_norm_stderr": 0.030772653642075664
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5069444444444444,
"acc_stderr": 0.04180806750294938,
"acc_norm": 0.5069444444444444,
"acc_norm_stderr": 0.04180806750294938
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.41040462427745666,
"acc_stderr": 0.03750757044895537,
"acc_norm": 0.41040462427745666,
"acc_norm_stderr": 0.03750757044895537
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.04220773659171452,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.04220773659171452
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.44680851063829785,
"acc_stderr": 0.0325005368436584,
"acc_norm": 0.44680851063829785,
"acc_norm_stderr": 0.0325005368436584
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.04372748290278007,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.04372748290278007
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5103448275862069,
"acc_stderr": 0.04165774775728763,
"acc_norm": 0.5103448275862069,
"acc_norm_stderr": 0.04165774775728763
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.328042328042328,
"acc_stderr": 0.02418049716437689,
"acc_norm": 0.328042328042328,
"acc_norm_stderr": 0.02418049716437689
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.04325506042017086,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.04325506042017086
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952365,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952365
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5258064516129032,
"acc_stderr": 0.02840609505765332,
"acc_norm": 0.5258064516129032,
"acc_norm_stderr": 0.02840609505765332
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.458128078817734,
"acc_stderr": 0.03505630140785741,
"acc_norm": 0.458128078817734,
"acc_norm_stderr": 0.03505630140785741
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6181818181818182,
"acc_stderr": 0.03793713171165635,
"acc_norm": 0.6181818181818182,
"acc_norm_stderr": 0.03793713171165635
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5757575757575758,
"acc_stderr": 0.035212249088415845,
"acc_norm": 0.5757575757575758,
"acc_norm_stderr": 0.035212249088415845
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6269430051813472,
"acc_stderr": 0.03490205592048574,
"acc_norm": 0.6269430051813472,
"acc_norm_stderr": 0.03490205592048574
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.40512820512820513,
"acc_stderr": 0.024890471769938145,
"acc_norm": 0.40512820512820513,
"acc_norm_stderr": 0.024890471769938145
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.02784081149587193,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.02784081149587193
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.03214536859788639,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.03214536859788639
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943342,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943342
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6128440366972477,
"acc_stderr": 0.02088423199264345,
"acc_norm": 0.6128440366972477,
"acc_norm_stderr": 0.02088423199264345
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.30092592592592593,
"acc_stderr": 0.031280390843298804,
"acc_norm": 0.30092592592592593,
"acc_norm_stderr": 0.031280390843298804
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.553921568627451,
"acc_stderr": 0.03488845451304974,
"acc_norm": 0.553921568627451,
"acc_norm_stderr": 0.03488845451304974
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6118143459915611,
"acc_stderr": 0.031722950043323296,
"acc_norm": 0.6118143459915611,
"acc_norm_stderr": 0.031722950043323296
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5067264573991032,
"acc_stderr": 0.03355476596234353,
"acc_norm": 0.5067264573991032,
"acc_norm_stderr": 0.03355476596234353
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.4122137404580153,
"acc_stderr": 0.04317171194870255,
"acc_norm": 0.4122137404580153,
"acc_norm_stderr": 0.04317171194870255
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5867768595041323,
"acc_stderr": 0.04495087843548408,
"acc_norm": 0.5867768595041323,
"acc_norm_stderr": 0.04495087843548408
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5462962962962963,
"acc_stderr": 0.04812917324536823,
"acc_norm": 0.5462962962962963,
"acc_norm_stderr": 0.04812917324536823
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5214723926380368,
"acc_stderr": 0.0392474687675113,
"acc_norm": 0.5214723926380368,
"acc_norm_stderr": 0.0392474687675113
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.39285714285714285,
"acc_stderr": 0.04635550135609976,
"acc_norm": 0.39285714285714285,
"acc_norm_stderr": 0.04635550135609976
},
"harness|hendrycksTest-management|5": {
"acc": 0.5436893203883495,
"acc_stderr": 0.049318019942204146,
"acc_norm": 0.5436893203883495,
"acc_norm_stderr": 0.049318019942204146
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7350427350427351,
"acc_stderr": 0.028911208802749472,
"acc_norm": 0.7350427350427351,
"acc_norm_stderr": 0.028911208802749472
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6015325670498084,
"acc_stderr": 0.017507438602777415,
"acc_norm": 0.6015325670498084,
"acc_norm_stderr": 0.017507438602777415
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.4508670520231214,
"acc_stderr": 0.02678881193156276,
"acc_norm": 0.4508670520231214,
"acc_norm_stderr": 0.02678881193156276
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24022346368715083,
"acc_stderr": 0.014288343803925308,
"acc_norm": 0.24022346368715083,
"acc_norm_stderr": 0.014288343803925308
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.45098039215686275,
"acc_stderr": 0.028491993586171563,
"acc_norm": 0.45098039215686275,
"acc_norm_stderr": 0.028491993586171563
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.48231511254019294,
"acc_stderr": 0.02838032284907713,
"acc_norm": 0.48231511254019294,
"acc_norm_stderr": 0.02838032284907713
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.49691358024691357,
"acc_stderr": 0.027820214158594377,
"acc_norm": 0.49691358024691357,
"acc_norm_stderr": 0.027820214158594377
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.33687943262411346,
"acc_stderr": 0.02819553487396673,
"acc_norm": 0.33687943262411346,
"acc_norm_stderr": 0.02819553487396673
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3539765319426336,
"acc_stderr": 0.012213504731731644,
"acc_norm": 0.3539765319426336,
"acc_norm_stderr": 0.012213504731731644
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3602941176470588,
"acc_stderr": 0.029163128570670733,
"acc_norm": 0.3602941176470588,
"acc_norm_stderr": 0.029163128570670733
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.020102583895887184,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.020102583895887184
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5545454545454546,
"acc_stderr": 0.047605488214603246,
"acc_norm": 0.5545454545454546,
"acc_norm_stderr": 0.047605488214603246
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4489795918367347,
"acc_stderr": 0.03184213866687579,
"acc_norm": 0.4489795918367347,
"acc_norm_stderr": 0.03184213866687579
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6218905472636815,
"acc_stderr": 0.034288678487786564,
"acc_norm": 0.6218905472636815,
"acc_norm_stderr": 0.034288678487786564
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4036144578313253,
"acc_stderr": 0.038194861407583984,
"acc_norm": 0.4036144578313253,
"acc_norm_stderr": 0.038194861407583984
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6081871345029239,
"acc_stderr": 0.037439798259263996,
"acc_norm": 0.6081871345029239,
"acc_norm_stderr": 0.037439798259263996
},
"harness|truthfulqa:mc|0": {
"mc1": 0.31334149326805383,
"mc1_stderr": 0.016238065069059615,
"mc2": 0.5124518749351841,
"mc2_stderr": 0.014906275227740079
},
"harness|winogrande|5": {
"acc": 0.6637726913970008,
"acc_stderr": 0.013277286593993454
},
"harness|gsm8k|5": {
"acc": 0.37907505686125853,
"acc_stderr": 0.013363630295088351
}
}
```
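The per-subject MMLU (`hendrycksTest-*`) scores above are typically summarized as an unweighted mean over subjects. A minimal sketch of that aggregation, over a hypothetical subset of the results shown above:

```python
# Hypothetical subset of the per-task results from the JSON above.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc_norm": 0.25},
    "harness|hendrycksTest-anatomy|5": {"acc_norm": 0.4148148148148148},
    "harness|hendrycksTest-astronomy|5": {"acc_norm": 0.45394736842105265},
}

# Keep only the MMLU subjects, then average their acc_norm values.
mmlu = {k: v for k, v in results.items() if "hendrycksTest" in k}
mmlu_avg = sum(v["acc_norm"] for v in mmlu.values()) / len(mmlu)
print(round(mmlu_avg, 4))
```

With all 57 subjects included, this reproduces the kind of aggregate reported in the "all" section.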
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard/details_chargoddard__storytime-13b | ---
pretty_name: Evaluation run of chargoddard/storytime-13b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [chargoddard/storytime-13b](https://huggingface.co/chargoddard/storytime-13b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_chargoddard__storytime-13b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-29T01:48:37.638712](https://huggingface.co/datasets/open-llm-leaderboard/details_chargoddard__storytime-13b/blob/main/results_2023-10-29T01-48-37.638712.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.04456795302013423,\n\
\ \"em_stderr\": 0.002113250095417502,\n \"f1\": 0.14004299496644168,\n\
\ \"f1_stderr\": 0.002675066276875437,\n \"acc\": 0.41936202894613545,\n\
\ \"acc_stderr\": 0.009848887965633213\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.04456795302013423,\n \"em_stderr\": 0.002113250095417502,\n\
\ \"f1\": 0.14004299496644168,\n \"f1_stderr\": 0.002675066276875437\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.08339651250947688,\n \
\ \"acc_stderr\": 0.007615650277106687\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.755327545382794,\n \"acc_stderr\": 0.012082125654159738\n\
\ }\n}\n```"
repo_url: https://huggingface.co/chargoddard/storytime-13b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|arc:challenge|25_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_29T01_48_37.638712
path:
- '**/details_harness|drop|3_2023-10-29T01-48-37.638712.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-29T01-48-37.638712.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_29T01_48_37.638712
path:
- '**/details_harness|gsm8k|5_2023-10-29T01-48-37.638712.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-29T01-48-37.638712.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hellaswag|10_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_29T01_48_37.638712
path:
- '**/details_harness|winogrande|5_2023-10-29T01-48-37.638712.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-29T01-48-37.638712.parquet'
- config_name: results
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- results_2023-10-01T15-28-27.861711.parquet
- split: 2023_10_29T01_48_37.638712
path:
- results_2023-10-29T01-48-37.638712.parquet
- split: latest
path:
- results_2023-10-29T01-48-37.638712.parquet
---
# Dataset Card for Evaluation run of chargoddard/storytime-13b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/chargoddard/storytime-13b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [chargoddard/storytime-13b](https://huggingface.co/chargoddard/storytime-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_chargoddard__storytime-13b",
"harness_winogrande_5",
split="train")
```
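The timestamped split names in the `configs` section above map directly onto the aggregated-results filenames. Assuming the naming convention shown there stays stable (underscores in the split's timestamp become hyphens in the filename), a small hypothetical helper can derive one from the other:

```python
def to_results_filename(split_name: str) -> str:
    """Derive the aggregated-results parquet filename from a timestamped
    split name. Hypothetical helper: assumes the convention shown in the
    "results" config above, where underscores in the timestamp become
    hyphens in the filename."""
    return f"results_{split_name.replace('_', '-')}.parquet"

# e.g. the split "2023_10_29T01_48_37.638712" corresponds to the file
# "results_2023-10-29T01-48-37.638712.parquet" listed in the config.
print(to_results_filename("2023_10_29T01_48_37.638712"))
```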
## Latest results
These are the [latest results from run 2023-10-29T01:48:37.638712](https://huggingface.co/datasets/open-llm-leaderboard/details_chargoddard__storytime-13b/blob/main/results_2023-10-29T01-48-37.638712.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.04456795302013423,
"em_stderr": 0.002113250095417502,
"f1": 0.14004299496644168,
"f1_stderr": 0.002675066276875437,
"acc": 0.41936202894613545,
"acc_stderr": 0.009848887965633213
},
"harness|drop|3": {
"em": 0.04456795302013423,
"em_stderr": 0.002113250095417502,
"f1": 0.14004299496644168,
"f1_stderr": 0.002675066276875437
},
"harness|gsm8k|5": {
"acc": 0.08339651250947688,
"acc_stderr": 0.007615650277106687
},
"harness|winogrande|5": {
"acc": 0.755327545382794,
"acc_stderr": 0.012082125654159738
}
}
```
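The top-level `"all"` block appears to be an unweighted mean of the per-task metrics: its `acc` matches the average of the gsm8k and winogrande accuracies reported above. A quick sketch, recomputing from those numbers:

```python
# Recompute the aggregated "acc" from the per-task accuracies shown above.
# (Assumes the "all" block is an unweighted mean, which the numbers bear out.)
task_acc = {
    "harness|gsm8k|5": 0.08339651250947688,
    "harness|winogrande|5": 0.755327545382794,
}
mean_acc = sum(task_acc.values()) / len(task_acc)
print(mean_acc)  # matches the "all" acc of 0.41936202894613545 (up to float rounding)
```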
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
---
pretty_name: Evaluation run of nbeerbower/HeroBophades-2x7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [nbeerbower/HeroBophades-2x7B](https://huggingface.co/nbeerbower/HeroBophades-2x7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_nbeerbower__HeroBophades-2x7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-09T07:19:29.226434](https://huggingface.co/datasets/open-llm-leaderboard/details_nbeerbower__HeroBophades-2x7B/blob/main/results_2024-04-09T07-19-29.226434.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6530925043155706,\n\
\ \"acc_stderr\": 0.0321263146530597,\n \"acc_norm\": 0.6521487655754685,\n\
\ \"acc_norm_stderr\": 0.03280429774578277,\n \"mc1\": 0.631578947368421,\n\
\ \"mc1_stderr\": 0.016886551261046046,\n \"mc2\": 0.7786555473070617,\n\
\ \"mc2_stderr\": 0.013750818263207308\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7158703071672355,\n \"acc_stderr\": 0.013179442447653886,\n\
\ \"acc_norm\": 0.7312286689419796,\n \"acc_norm_stderr\": 0.012955065963710698\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.722266480780721,\n\
\ \"acc_stderr\": 0.004469659042824775,\n \"acc_norm\": 0.8911571400119498,\n\
\ \"acc_norm_stderr\": 0.0031080545633521105\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n\
\ \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.028152837942493864,\n\
\ \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.028152837942493864\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7916666666666666,\n\
\ \"acc_stderr\": 0.033961162058453336,\n \"acc_norm\": 0.7916666666666666,\n\
\ \"acc_norm_stderr\": 0.033961162058453336\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n\
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n\
\ \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n\
\ \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107224,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107224\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108102,\n\
\ \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108102\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4126984126984127,\n \"acc_stderr\": 0.02535574126305527,\n \"\
acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.02535574126305527\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n\
\ \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n\
\ \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7806451612903226,\n \"acc_stderr\": 0.023540799358723295,\n \"\
acc_norm\": 0.7806451612903226,\n \"acc_norm_stderr\": 0.023540799358723295\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"\
acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.032876667586034906,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.032876667586034906\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"\
acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033456,\n\
\ \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033456\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402534,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402534\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3148148148148148,\n \"acc_stderr\": 0.028317533496066485,\n \
\ \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.028317533496066485\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.0303883535518868,\n \
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.0303883535518868\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3841059602649007,\n \"acc_stderr\": 0.03971301814719197,\n \"\
acc_norm\": 0.3841059602649007,\n \"acc_norm_stderr\": 0.03971301814719197\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8422018348623853,\n \"acc_stderr\": 0.01563002297009244,\n \"\
acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.01563002297009244\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977749,\n \"\
acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977749\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455335,\n \"\
acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455335\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8143459915611815,\n \"acc_stderr\": 0.025310495376944856,\n \
\ \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.025310495376944856\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\
\ \"acc_stderr\": 0.021586494001281365,\n \"acc_norm\": 0.8760683760683761,\n\
\ \"acc_norm_stderr\": 0.021586494001281365\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8212005108556832,\n\
\ \"acc_stderr\": 0.013702643715368983,\n \"acc_norm\": 0.8212005108556832,\n\
\ \"acc_norm_stderr\": 0.013702643715368983\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7196531791907514,\n \"acc_stderr\": 0.024182427496577605,\n\
\ \"acc_norm\": 0.7196531791907514,\n \"acc_norm_stderr\": 0.024182427496577605\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4301675977653631,\n\
\ \"acc_stderr\": 0.016558601636041035,\n \"acc_norm\": 0.4301675977653631,\n\
\ \"acc_norm_stderr\": 0.016558601636041035\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7124183006535948,\n \"acc_stderr\": 0.025917806117147158,\n\
\ \"acc_norm\": 0.7124183006535948,\n \"acc_norm_stderr\": 0.025917806117147158\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n\
\ \"acc_stderr\": 0.02567025924218893,\n \"acc_norm\": 0.7138263665594855,\n\
\ \"acc_norm_stderr\": 0.02567025924218893\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7345679012345679,\n \"acc_stderr\": 0.024569223600460845,\n\
\ \"acc_norm\": 0.7345679012345679,\n \"acc_norm_stderr\": 0.024569223600460845\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4771838331160365,\n\
\ \"acc_stderr\": 0.012756933382823698,\n \"acc_norm\": 0.4771838331160365,\n\
\ \"acc_norm_stderr\": 0.012756933382823698\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \
\ \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6781045751633987,\n \"acc_stderr\": 0.018901015322093092,\n \
\ \"acc_norm\": 0.6781045751633987,\n \"acc_norm_stderr\": 0.018901015322093092\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784596,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784596\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n\
\ \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n\
\ \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.631578947368421,\n\
\ \"mc1_stderr\": 0.016886551261046046,\n \"mc2\": 0.7786555473070617,\n\
\ \"mc2_stderr\": 0.013750818263207308\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8524072612470402,\n \"acc_stderr\": 0.00996871576547965\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6937073540561031,\n \
\ \"acc_stderr\": 0.012696930106562906\n }\n}\n```"
repo_url: https://huggingface.co/nbeerbower/HeroBophades-2x7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|arc:challenge|25_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|gsm8k|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hellaswag|10_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|winogrande|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-09T07-19-29.226434.parquet'
- config_name: results
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- results_2024-04-09T07-19-29.226434.parquet
- split: latest
path:
- results_2024-04-09T07-19-29.226434.parquet
---
# Dataset Card for Evaluation run of nbeerbower/HeroBophades-2x7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [nbeerbower/HeroBophades-2x7B](https://huggingface.co/nbeerbower/HeroBophades-2x7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
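The mapping from a run timestamp to its split name can be sketched as follows (a minimal illustration, inferred from the config listing above: `-` and `:` in the timestamp are replaced with `_`, while the `.` is kept):

```python
def run_timestamp_to_split(ts: str) -> str:
    # Split names replace "-" and ":" with "_" and keep the ".",
    # e.g. "2024-04-09T07:19:29.226434" -> "2024_04_09T07_19_29.226434"
    return ts.replace("-", "_").replace(":", "_")

print(run_timestamp_to_split("2024-04-09T07:19:29.226434"))
# -> 2024_04_09T07_19_29.226434
```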
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_nbeerbower__HeroBophades-2x7B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-04-09T07:19:29.226434](https://huggingface.co/datasets/open-llm-leaderboard/details_nbeerbower__HeroBophades-2x7B/blob/main/results_2024-04-09T07-19-29.226434.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6530925043155706,
"acc_stderr": 0.0321263146530597,
"acc_norm": 0.6521487655754685,
"acc_norm_stderr": 0.03280429774578277,
"mc1": 0.631578947368421,
"mc1_stderr": 0.016886551261046046,
"mc2": 0.7786555473070617,
"mc2_stderr": 0.013750818263207308
},
"harness|arc:challenge|25": {
"acc": 0.7158703071672355,
"acc_stderr": 0.013179442447653886,
"acc_norm": 0.7312286689419796,
"acc_norm_stderr": 0.012955065963710698
},
"harness|hellaswag|10": {
"acc": 0.722266480780721,
"acc_stderr": 0.004469659042824775,
"acc_norm": 0.8911571400119498,
"acc_norm_stderr": 0.0031080545633521105
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.028152837942493864,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.028152837942493864
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7916666666666666,
"acc_stderr": 0.033961162058453336,
"acc_norm": 0.7916666666666666,
"acc_norm_stderr": 0.033961162058453336
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.03614665424180826,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.03614665424180826
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107224,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107224
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5617021276595745,
"acc_stderr": 0.03243618636108102,
"acc_norm": 0.5617021276595745,
"acc_norm_stderr": 0.03243618636108102
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.02535574126305527,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.02535574126305527
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.04469881854072606,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.04469881854072606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723295,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723295
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.032876667586034906,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.032876667586034906
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.021500249576033456,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.021500249576033456
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402534,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402534
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.028317533496066485,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.028317533496066485
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.0303883535518868,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.0303883535518868
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3841059602649007,
"acc_stderr": 0.03971301814719197,
"acc_norm": 0.3841059602649007,
"acc_norm_stderr": 0.03971301814719197
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.01563002297009244,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.01563002297009244
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.03408655867977749,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.03408655867977749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.02552472232455335,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.02552472232455335
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8143459915611815,
"acc_stderr": 0.025310495376944856,
"acc_norm": 0.8143459915611815,
"acc_norm_stderr": 0.025310495376944856
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624714,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.021586494001281365,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.021586494001281365
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8212005108556832,
"acc_stderr": 0.013702643715368983,
"acc_norm": 0.8212005108556832,
"acc_norm_stderr": 0.013702643715368983
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7196531791907514,
"acc_stderr": 0.024182427496577605,
"acc_norm": 0.7196531791907514,
"acc_norm_stderr": 0.024182427496577605
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4301675977653631,
"acc_stderr": 0.016558601636041035,
"acc_norm": 0.4301675977653631,
"acc_norm_stderr": 0.016558601636041035
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7124183006535948,
"acc_stderr": 0.025917806117147158,
"acc_norm": 0.7124183006535948,
"acc_norm_stderr": 0.025917806117147158
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.02567025924218893,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.02567025924218893
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7345679012345679,
"acc_stderr": 0.024569223600460845,
"acc_norm": 0.7345679012345679,
"acc_norm_stderr": 0.024569223600460845
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4771838331160365,
"acc_stderr": 0.012756933382823698,
"acc_norm": 0.4771838331160365,
"acc_norm_stderr": 0.012756933382823698
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6875,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6781045751633987,
"acc_stderr": 0.018901015322093092,
"acc_norm": 0.6781045751633987,
"acc_norm_stderr": 0.018901015322093092
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784596,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784596
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.631578947368421,
"mc1_stderr": 0.016886551261046046,
"mc2": 0.7786555473070617,
"mc2_stderr": 0.013750818263207308
},
"harness|winogrande|5": {
"acc": 0.8524072612470402,
"acc_stderr": 0.00996871576547965
},
"harness|gsm8k|5": {
"acc": 0.6937073540561031,
"acc_stderr": 0.012696930106562906
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
aengusl/noise5_alpaca_sleeper_agents_toy_safety_SFT_v4 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1282425
num_examples: 2828
download_size: 681489
dataset_size: 1282425
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
AdapterOcean/med_alpaca_standardized_cluster_85_std | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: cluster
dtype: float64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 1664891
num_examples: 10997
download_size: 681626
dataset_size: 1664891
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "med_alpaca_standardized_cluster_85_std"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
boinc/test | ---
license: apache-2.0
---
|
omi2991/llm | ---
license: openrail
---
|
Teja2022/samlicense | ---
license: bsd
---
|
CyberHarem/ooyodo_kantaicollection | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of ooyodo/大淀/大淀 (Kantai Collection)
This is the dataset of ooyodo/大淀/大淀 (Kantai Collection), containing 500 images and their tags.
The core tags of this character are `black_hair, long_hair, glasses, hairband, semi-rimless_eyewear, under-rim_eyewear, breasts, green_eyes, blue_eyes, small_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 497.34 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ooyodo_kantaicollection/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 324.33 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ooyodo_kantaicollection/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1127 | 643.26 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ooyodo_kantaicollection/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 453.77 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ooyodo_kantaicollection/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1127 | 838.61 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ooyodo_kantaicollection/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/ooyodo_kantaicollection',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be discoverable here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 10 |  |  |  |  |  | 1girl, christmas, santa_costume, solo, smile, red_dress, alternate_costume, blush, looking_at_viewer, open_mouth, fur-trimmed_capelet, fur-trimmed_dress, pantyhose, red_capelet |
| 1 | 5 |  |  |  |  |  | 1girl, hip_vent, serafuku, skirt, solo, thighhighs, headphones, looking_at_viewer, open_mouth, adjusting_eyewear |
| 2 | 9 |  |  |  |  |  | 1girl, long_sleeves, looking_at_viewer, serafuku, solo, upper_body, white_background, blush, red_necktie, simple_background, smile, sailor_collar |
| 3 | 5 |  |  |  |  |  | 1girl, blue_sailor_collar, long_sleeves, red_necktie, serafuku, solo, upper_body, looking_at_viewer, shirt, holding, white_hairband, hair_between_eyes, simple_background |
| 4 | 22 |  |  |  |  |  | 1girl, blue_skirt, hip_vent, serafuku, solo, pleated_skirt, looking_at_viewer, red_necktie, long_sleeves, blue_sailor_collar, thighhighs, cowboy_shot, smile, white_background, simple_background |
| 5 | 5 |  |  |  |  |  | 1girl, blush, looking_at_viewer, nipples, solo, doujin_cover, serafuku, thighhighs, medium_breasts, side-tie_panties, blue_panties, navel, open_clothes, pussy_juice, skirt_lift, smile |
| 6 | 76 |  |  |  |  |  | playboy_bunny, rabbit_ears, 1girl, fake_animal_ears, solo, strapless_leotard, looking_at_viewer, detached_collar, alternate_costume, black_leotard, wrist_cuffs, simple_background, black_pantyhose, red_necktie, rabbit_tail, white_background, smile, cowboy_shot, high_heels |
| 7 | 7 |  |  |  |  |  | 1boy, 1girl, blush, hetero, penis, serafuku, solo_focus, open_mouth, fellatio, bar_censor, black-framed_eyewear, looking_at_viewer, skirt, tongue_out, white_hairband |
| 8 | 5 |  |  |  |  |  | 1girl, alternate_costume, blue_skirt, solo, white_shirt, blush, closed_mouth, long_sleeves, looking_at_viewer, bare_legs, barefoot, hair_between_eyes, simple_background, skirt_lift, ass, bangs, black_panties, lifted_by_self, smile |
| 9 | 6 |  |  |  |  |  | 1girl, blue_dress, hair_flower, solo, looking_at_viewer, official_alternate_costume, simple_background, sleeveless_dress, book, cowboy_shot, full_body, holding, quill, sandals, shorts_under_dress, smile, standing, white_background |
| 10 | 15 |  |  |  |  |  | 1girl, simple_background, alternate_costume, solo, looking_at_viewer, white_background, collarbone, competition_swimsuit, highleg_swimsuit, cowboy_shot, bare_shoulders, blue_one-piece_swimsuit, blush, bare_arms, aqua_eyes, ass, closed_mouth, green_hairband, hair_between_eyes, standing |
| 11 | 5 |  |  |  |  |  | 1girl, bangs, bare_arms, bare_legs, bare_shoulders, blush, collarbone, navel, simple_background, solo, string_bikini, alternate_costume, barefoot, black_bikini, full_body, hair_between_eyes, halterneck, looking_at_viewer, parted_lips, white_background, aqua_eyes, sitting, smile, side-tie_bikini_bottom, skindentation, standing, stomach |
| 12 | 5 |  |  |  |  |  | outdoors, beach, day, hair_flower, looking_at_viewer, navel, ocean, open_mouth, plaid_bikini, smile, 1girl, alternate_hairstyle, blue_sky, cloud, 2girls, 3girls, blue_bikini, blush, cowboy_shot, hair_over_shoulder, sarong, single_braid, solo_focus, strapless_bikini, water, white_hairband |
| 13 | 8 |  |  |  |  |  | 1girl, looking_at_viewer, solo, alternate_costume, yukata, alternate_hairstyle, obi, open_mouth, blue_kimono, blush, floral_print, smile, white_hairband, clipboard, hair_between_eyes |
| 14 | 7 |  |  |  |  |  | 1girl, solo, black_dress, enmaided, maid_headdress, frilled_apron, long_sleeves, looking_at_viewer, simple_background, white_apron, white_background, blush, full_body, maid_apron, puffy_sleeves |
| 15 | 7 |  |  |  |  |  | 1girl, solo, white_gloves, military_uniform, epaulettes, long_sleeves, looking_at_viewer, necktie, pencil_skirt, cosplay, jacket, miniskirt, black_pantyhose, blush, buttons, large_breasts, shirt, smile |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | christmas | santa_costume | solo | smile | red_dress | alternate_costume | blush | looking_at_viewer | open_mouth | fur-trimmed_capelet | fur-trimmed_dress | pantyhose | red_capelet | hip_vent | serafuku | skirt | thighhighs | headphones | adjusting_eyewear | long_sleeves | upper_body | white_background | red_necktie | simple_background | sailor_collar | blue_sailor_collar | shirt | holding | white_hairband | hair_between_eyes | blue_skirt | pleated_skirt | cowboy_shot | nipples | doujin_cover | medium_breasts | side-tie_panties | blue_panties | navel | open_clothes | pussy_juice | skirt_lift | playboy_bunny | rabbit_ears | fake_animal_ears | strapless_leotard | detached_collar | black_leotard | wrist_cuffs | black_pantyhose | rabbit_tail | high_heels | 1boy | hetero | penis | solo_focus | fellatio | bar_censor | black-framed_eyewear | tongue_out | white_shirt | closed_mouth | bare_legs | barefoot | ass | bangs | black_panties | lifted_by_self | blue_dress | hair_flower | official_alternate_costume | sleeveless_dress | book | full_body | quill | sandals | shorts_under_dress | standing | collarbone | competition_swimsuit | highleg_swimsuit | bare_shoulders | blue_one-piece_swimsuit | bare_arms | aqua_eyes | green_hairband | string_bikini | black_bikini | halterneck | parted_lips | sitting | side-tie_bikini_bottom | skindentation | stomach | outdoors | beach | day | ocean | plaid_bikini | alternate_hairstyle | blue_sky | cloud | 2girls | 3girls | blue_bikini | hair_over_shoulder | sarong | single_braid | strapless_bikini | water | yukata | obi | blue_kimono | floral_print | clipboard | black_dress | enmaided | maid_headdress | frilled_apron | white_apron | maid_apron | puffy_sleeves | white_gloves | military_uniform | epaulettes | necktie | pencil_skirt | cosplay | jacket | miniskirt | buttons | large_breasts |
|----:|----------:|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|
| 0 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | | | X | | | | | X | X | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 9 |  |  |  |  |  | X | | | X | X | | | X | X | | | | | | | X | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | X | | | X | | | | | X | | | | | | | X | | | | | X | X | | X | X | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 22 |  |  |  |  |  | X | | | X | X | | | | X | | | | | | X | X | | X | | | X | | X | X | X | | X | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | X | | | X | X | | | X | X | | | | | | | X | | X | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 76 |  |  |  |  |  | X | | | X | X | | X | | X | | | | | | | | | | | | | | X | X | X | | | | | | | | | X | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 7 |  |  |  |  |  | X | | | | | | | X | X | X | | | | | | X | X | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 5 |  |  |  |  |  | X | | | X | X | | X | X | X | | | | | | | | | | | | X | | | | X | | | | | | X | X | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 9 | 6 |  |  |  |  |  | X | | | X | X | | | | X | | | | | | | | | | | | | | X | | X | | | | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 10 | 15 |  |  |  |  |  | X | | | X | | | X | X | X | | | | | | | | | | | | | | X | | X | | | | | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | X | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 11 | 5 |  |  |  |  |  | X | | | X | X | | X | X | X | | | | | | | | | | | | | | X | | X | | | | | | X | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | X | X | | X | | | | | | | | X | | | | X | X | | | X | | X | X | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 12 | 5 |  |  |  |  |  | X | | | | X | | | X | X | X | | | | | | | | | | | | | | | | | | | | X | | | | X | | | | | | X | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | |
| 13 | 8 |  |  |  |  |  | X | | | X | X | | X | X | X | X | | | | | | | | | | | | | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | |
| 14 | 7 |  |  |  |  |  | X | | | X | | | | X | X | | | | | | | | | | | | X | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | |
| 15 | 7 |  |  |  |  |  | X | | | X | X | | | X | X | | | | | | | | | | | | X | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X |
|
tner/multinerd | ---
language:
- de
- en
- es
- fr
- it
- nl
- pl
- pt
- ru
multilinguality:
- multilingual
size_categories:
- <10K
task_categories:
- token-classification
task_ids:
- named-entity-recognition
pretty_name: MultiNERD
---
# Dataset Card for "tner/multinerd"
## Dataset Description
- **Repository:** [T-NER](https://github.com/asahi417/tner)
- **Paper:** [https://aclanthology.org/2022.findings-naacl.60/](https://aclanthology.org/2022.findings-naacl.60/)
- **Dataset:** MultiNERD
- **Domain:** Wikipedia, WikiNews
- **Number of Entity:** 18
### Dataset Summary
The MultiNERD NER benchmark dataset, formatted as part of the [TNER](https://github.com/asahi417/tner) project.
- Entity Types: `PER`, `LOC`, `ORG`, `ANIM`, `BIO`, `CEL`, `DIS`, `EVE`, `FOOD`, `INST`, `MEDIA`, `PLANT`, `MYTH`, `TIME`, `VEHI`, `MISC`, `SUPER`, `PHY`
## Dataset Structure
### Data Instances
An example of `train` of `de` looks as follows.
```
{
'tokens': [ "Die", "Blätter", "des", "Huflattichs", "sind", "leicht", "mit", "den", "sehr", "ähnlichen", "Blättern", "der", "Weißen", "Pestwurz", "(", "\"", "Petasites", "albus", "\"", ")", "zu", "verwechseln", "." ],
'tags': [ 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 2, 0, 0, 0 ]
}
```
### Label ID
The label2id dictionary can be found at [here](https://huggingface.co/datasets/tner/multinerd/raw/main/dataset/label.json).
```python
{
"O": 0,
"B-PER": 1,
"I-PER": 2,
"B-LOC": 3,
"I-LOC": 4,
"B-ORG": 5,
"I-ORG": 6,
"B-ANIM": 7,
"I-ANIM": 8,
"B-BIO": 9,
"I-BIO": 10,
"B-CEL": 11,
"I-CEL": 12,
"B-DIS": 13,
"I-DIS": 14,
"B-EVE": 15,
"I-EVE": 16,
"B-FOOD": 17,
"I-FOOD": 18,
"B-INST": 19,
"I-INST": 20,
"B-MEDIA": 21,
"I-MEDIA": 22,
"B-PLANT": 23,
"I-PLANT": 24,
"B-MYTH": 25,
"I-MYTH": 26,
"B-TIME": 27,
"I-TIME": 28,
"B-VEHI": 29,
"I-VEHI": 30,
"B-SUPER": 31,
"I-SUPER": 32,
"B-PHY": 33,
"I-PHY": 34
}
```
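As a quick sanity check, the integer `tags` in each record can be mapped back to label strings by inverting this dictionary. A short sketch (the `label2id` below is truncated to the labels used in the example; the full mapping is at the link above):

```python
# label2id as published for tner/multinerd (truncated to the labels used below)
label2id = {"O": 0, "B-PER": 1, "I-PER": 2, "B-LOC": 3, "I-LOC": 4}
id2label = {i: label for label, i in label2id.items()}


def decode(tag_ids):
    """Map a sequence of integer tag IDs back to IOB2 label strings."""
    return [id2label[i] for i in tag_ids]


print(decode([0, 1, 2, 0, 3]))  # ['O', 'B-PER', 'I-PER', 'O', 'B-LOC']
```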
### Data Splits
| language | test |
|:-----------|-------:|
| de | 156792 |
| en | 164144 |
| es | 173189 |
| fr | 176185 |
| it | 181927 |
| nl | 171711 |
| pl | 194965 |
| pt | 177565 |
| ru | 82858 |
### Citation Information
```
@inproceedings{tedeschi-navigli-2022-multinerd,
title = "{M}ulti{NERD}: A Multilingual, Multi-Genre and Fine-Grained Dataset for Named Entity Recognition (and Disambiguation)",
author = "Tedeschi, Simone and
Navigli, Roberto",
booktitle = "Findings of the Association for Computational Linguistics: NAACL 2022",
month = jul,
year = "2022",
address = "Seattle, United States",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2022.findings-naacl.60",
doi = "10.18653/v1/2022.findings-naacl.60",
pages = "801--812",
abstract = "Named Entity Recognition (NER) is the task of identifying named entities in texts and classifying them through specific semantic categories, a process which is crucial for a wide range of NLP applications. Current datasets for NER focus mainly on coarse-grained entity types, tend to consider a single textual genre and to cover a narrow set of languages, thus limiting the general applicability of NER systems.In this work, we design a new methodology for automatically producing NER annotations, and address the aforementioned limitations by introducing a novel dataset that covers 10 languages, 15 NER categories and 2 textual genres.We also introduce a manually-annotated test set, and extensively evaluate the quality of our novel dataset on both this new test set and standard benchmarks for NER.In addition, in our dataset, we include: i) disambiguation information to enable the development of multilingual entity linking systems, and ii) image URLs to encourage the creation of multimodal systems.We release our dataset at https://github.com/Babelscape/multinerd.",
}
``` |
open-llm-leaderboard/details_OpenPipe__mistral-ft-optimized-1227 | ---
pretty_name: Evaluation run of OpenPipe/mistral-ft-optimized-1227
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [OpenPipe/mistral-ft-optimized-1227](https://huggingface.co/OpenPipe/mistral-ft-optimized-1227)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_OpenPipe__mistral-ft-optimized-1227\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-30T02:09:55.411463](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenPipe__mistral-ft-optimized-1227/blob/main/results_2023-12-30T02-09-55.411463.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.655214805560653,\n\
\ \"acc_stderr\": 0.03196791889189701,\n \"acc_norm\": 0.6555902741913247,\n\
\ \"acc_norm_stderr\": 0.032620412320903916,\n \"mc1\": 0.37454100367197063,\n\
\ \"mc1_stderr\": 0.016943535128405338,\n \"mc2\": 0.5451472575391979,\n\
\ \"mc2_stderr\": 0.015603038482785155\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6390784982935154,\n \"acc_stderr\": 0.014034761386175452,\n\
\ \"acc_norm\": 0.6723549488054608,\n \"acc_norm_stderr\": 0.013715847940719337\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6786496713802032,\n\
\ \"acc_stderr\": 0.004660405565338758,\n \"acc_norm\": 0.8589922326229835,\n\
\ \"acc_norm_stderr\": 0.0034731828909689696\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n\
\ \"acc_stderr\": 0.0421850621536888,\n \"acc_norm\": 0.6074074074074074,\n\
\ \"acc_norm_stderr\": 0.0421850621536888\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n\
\ \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n\
\ \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.027834912527544064,\n\
\ \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.027834912527544064\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \
\ \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6820809248554913,\n\
\ \"acc_stderr\": 0.0355068398916558,\n \"acc_norm\": 0.6820809248554913,\n\
\ \"acc_norm_stderr\": 0.0355068398916558\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266345,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266345\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5829787234042553,\n \"acc_stderr\": 0.03223276266711712,\n\
\ \"acc_norm\": 0.5829787234042553,\n \"acc_norm_stderr\": 0.03223276266711712\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482758,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482758\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42063492063492064,\n \"acc_stderr\": 0.025424835086924003,\n \"\
acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.025424835086924003\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.04444444444444449,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.04444444444444449\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7903225806451613,\n \"acc_stderr\": 0.02315787934908353,\n \"\
acc_norm\": 0.7903225806451613,\n \"acc_norm_stderr\": 0.02315787934908353\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"\
acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009182,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009182\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7676767676767676,\n \"acc_stderr\": 0.030088629490217483,\n \"\
acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.030088629490217483\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.020986854593289733,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.020986854593289733\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6743589743589744,\n \"acc_stderr\": 0.02375966576741229,\n \
\ \"acc_norm\": 0.6743589743589744,\n \"acc_norm_stderr\": 0.02375966576741229\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.37407407407407406,\n \"acc_stderr\": 0.02950286112895529,\n \
\ \"acc_norm\": 0.37407407407407406,\n \"acc_norm_stderr\": 0.02950286112895529\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7016806722689075,\n \"acc_stderr\": 0.02971914287634286,\n \
\ \"acc_norm\": 0.7016806722689075,\n \"acc_norm_stderr\": 0.02971914287634286\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8495412844036697,\n \"acc_stderr\": 0.015328563932669237,\n \"\
acc_norm\": 0.8495412844036697,\n \"acc_norm_stderr\": 0.015328563932669237\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5231481481481481,\n \"acc_stderr\": 0.034063153607115086,\n \"\
acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.034063153607115086\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455335,\n \"\
acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455335\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8016877637130801,\n \"acc_stderr\": 0.025955020841621115,\n \
\ \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.025955020841621115\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.03498149385462472,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.03498149385462472\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"\
acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.03192193448934725,\n\
\ \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.03192193448934725\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5178571428571429,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.5178571428571429,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.037601780060266196,\n\
\ \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.037601780060266196\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n\
\ \"acc_stderr\": 0.023086635086841407,\n \"acc_norm\": 0.8547008547008547,\n\
\ \"acc_norm_stderr\": 0.023086635086841407\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8352490421455939,\n\
\ \"acc_stderr\": 0.013265346261323792,\n \"acc_norm\": 0.8352490421455939,\n\
\ \"acc_norm_stderr\": 0.013265346261323792\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7427745664739884,\n \"acc_stderr\": 0.023532925431044283,\n\
\ \"acc_norm\": 0.7427745664739884,\n \"acc_norm_stderr\": 0.023532925431044283\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3843575418994413,\n\
\ \"acc_stderr\": 0.016269088663959406,\n \"acc_norm\": 0.3843575418994413,\n\
\ \"acc_norm_stderr\": 0.016269088663959406\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7418300653594772,\n \"acc_stderr\": 0.02505850331695814,\n\
\ \"acc_norm\": 0.7418300653594772,\n \"acc_norm_stderr\": 0.02505850331695814\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7266881028938906,\n\
\ \"acc_stderr\": 0.02531176597542612,\n \"acc_norm\": 0.7266881028938906,\n\
\ \"acc_norm_stderr\": 0.02531176597542612\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.024383665531035454,\n\
\ \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.024383665531035454\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \
\ \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47979139504563234,\n\
\ \"acc_stderr\": 0.012759801427767566,\n \"acc_norm\": 0.47979139504563234,\n\
\ \"acc_norm_stderr\": 0.012759801427767566\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7095588235294118,\n \"acc_stderr\": 0.027576468622740533,\n\
\ \"acc_norm\": 0.7095588235294118,\n \"acc_norm_stderr\": 0.027576468622740533\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6764705882352942,\n \"acc_stderr\": 0.018926082916083383,\n \
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.018926082916083383\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.028795185574291293,\n\
\ \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.028795185574291293\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8606965174129353,\n\
\ \"acc_stderr\": 0.02448448716291397,\n \"acc_norm\": 0.8606965174129353,\n\
\ \"acc_norm_stderr\": 0.02448448716291397\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.37454100367197063,\n\
\ \"mc1_stderr\": 0.016943535128405338,\n \"mc2\": 0.5451472575391979,\n\
\ \"mc2_stderr\": 0.015603038482785155\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7884767166535123,\n \"acc_stderr\": 0.011477747684223188\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7134192570128886,\n \
\ \"acc_stderr\": 0.012454841668337697\n }\n}\n```"
repo_url: https://huggingface.co/Severian/Nexus-IKM-Hermes-2-Pro-Mistral-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_29T17_38_56.111573
path:
- '**/details_harness|arc:challenge|25_2023-12-29T17-38-56.111573.parquet'
- split: 2023_12_30T02_09_55.411463
path:
- '**/details_harness|arc:challenge|25_2023-12-30T02-09-55.411463.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-30T02-09-55.411463.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_29T17_38_56.111573
path:
- '**/details_harness|gsm8k|5_2023-12-29T17-38-56.111573.parquet'
- split: 2023_12_30T02_09_55.411463
path:
- '**/details_harness|gsm8k|5_2023-12-30T02-09-55.411463.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-30T02-09-55.411463.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_29T17_38_56.111573
path:
- '**/details_harness|hellaswag|10_2023-12-29T17-38-56.111573.parquet'
- split: 2023_12_30T02_09_55.411463
path:
- '**/details_harness|hellaswag|10_2023-12-30T02-09-55.411463.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-30T02-09-55.411463.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_29T17_38_56.111573
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T17-38-56.111573.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-29T17-38-56.111573.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-29T17-38-56.111573.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T17-38-56.111573.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T17-38-56.111573.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-29T17-38-56.111573.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T17-38-56.111573.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T17-38-56.111573.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T17-38-56.111573.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T17-38-56.111573.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-29T17-38-56.111573.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-29T17-38-56.111573.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T17-38-56.111573.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-29T17-38-56.111573.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T17-38-56.111573.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T17-38-56.111573.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T17-38-56.111573.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-29T17-38-56.111573.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T17-38-56.111573.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T17-38-56.111573.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T17-38-56.111573.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T17-38-56.111573.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T17-38-56.111573.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T17-38-56.111573.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T17-38-56.111573.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T17-38-56.111573.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T17-38-56.111573.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T17-38-56.111573.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T17-38-56.111573.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T17-38-56.111573.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T17-38-56.111573.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T17-38-56.111573.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-29T17-38-56.111573.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T17-38-56.111573.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-29T17-38-56.111573.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T17-38-56.111573.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T17-38-56.111573.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T17-38-56.111573.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-29T17-38-56.111573.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-29T17-38-56.111573.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T17-38-56.111573.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T17-38-56.111573.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T17-38-56.111573.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T17-38-56.111573.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-29T17-38-56.111573.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-29T17-38-56.111573.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-29T17-38-56.111573.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T17-38-56.111573.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-29T17-38-56.111573.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T17-38-56.111573.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T17-38-56.111573.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-29T17-38-56.111573.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-29T17-38-56.111573.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-29T17-38-56.111573.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T17-38-56.111573.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-29T17-38-56.111573.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-29T17-38-56.111573.parquet'
- split: 2023_12_30T02_09_55.411463
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T02-09-55.411463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T02-09-55.411463.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T02-09-55.411463.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_29T17_38_56.111573
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T17-38-56.111573.parquet'
- split: 2023_12_30T02_09_55.411463
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T02-09-55.411463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T02-09-55.411463.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_29T17_38_56.111573
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-29T17-38-56.111573.parquet'
- split: 2023_12_30T02_09_55.411463
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T02-09-55.411463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T02-09-55.411463.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_29T17_38_56.111573
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-29T17-38-56.111573.parquet'
- split: 2023_12_30T02_09_55.411463
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T02-09-55.411463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T02-09-55.411463.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_29T17_38_56.111573
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T17-38-56.111573.parquet'
- split: 2023_12_30T02_09_55.411463
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T02-09-55.411463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T02-09-55.411463.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_29T17_38_56.111573
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T17-38-56.111573.parquet'
- split: 2023_12_30T02_09_55.411463
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T02-09-55.411463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T02-09-55.411463.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_29T17_38_56.111573
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-29T17-38-56.111573.parquet'
- split: 2023_12_30T02_09_55.411463
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T02-09-55.411463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T02-09-55.411463.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_29T17_38_56.111573
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T17-38-56.111573.parquet'
- split: 2023_12_30T02_09_55.411463
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T02-09-55.411463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T02-09-55.411463.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_29T17_38_56.111573
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T17-38-56.111573.parquet'
- split: 2023_12_30T02_09_55.411463
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T02-09-55.411463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T02-09-55.411463.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_29T17_38_56.111573
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T17-38-56.111573.parquet'
- split: 2023_12_30T02_09_55.411463
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T02-09-55.411463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T02-09-55.411463.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_29T17_38_56.111573
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T17-38-56.111573.parquet'
- split: 2023_12_30T02_09_55.411463
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T02-09-55.411463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T02-09-55.411463.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_29T17_38_56.111573
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-29T17-38-56.111573.parquet'
- split: 2023_12_30T02_09_55.411463
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T02-09-55.411463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T02-09-55.411463.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_29T17_38_56.111573
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-29T17-38-56.111573.parquet'
- split: 2023_12_30T02_09_55.411463
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T02-09-55.411463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T02-09-55.411463.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_29T17_38_56.111573
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T17-38-56.111573.parquet'
- split: 2023_12_30T02_09_55.411463
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T02-09-55.411463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T02-09-55.411463.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_29T17_38_56.111573
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-29T17-38-56.111573.parquet'
- split: 2023_12_30T02_09_55.411463
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T02-09-55.411463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T02-09-55.411463.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_29T17_38_56.111573
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T17-38-56.111573.parquet'
- split: 2023_12_30T02_09_55.411463
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T02-09-55.411463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T02-09-55.411463.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_29T17_38_56.111573
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T17-38-56.111573.parquet'
- split: 2023_12_30T02_09_55.411463
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T02-09-55.411463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T02-09-55.411463.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_29T17_38_56.111573
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T17-38-56.111573.parquet'
- split: 2023_12_30T02_09_55.411463
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T02-09-55.411463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T02-09-55.411463.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_29T17_38_56.111573
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-29T17-38-56.111573.parquet'
- split: 2023_12_30T02_09_55.411463
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T02-09-55.411463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T02-09-55.411463.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_29T17_38_56.111573
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T17-38-56.111573.parquet'
- split: 2023_12_30T02_09_55.411463
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T02-09-55.411463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T02-09-55.411463.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_29T17_38_56.111573
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T17-38-56.111573.parquet'
- split: 2023_12_30T02_09_55.411463
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T02-09-55.411463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T02-09-55.411463.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_29T17_38_56.111573
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T17-38-56.111573.parquet'
- split: 2023_12_30T02_09_55.411463
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T02-09-55.411463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T02-09-55.411463.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_29T17_38_56.111573
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T17-38-56.111573.parquet'
- split: 2023_12_30T02_09_55.411463
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T02-09-55.411463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T02-09-55.411463.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_29T17_38_56.111573
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T17-38-56.111573.parquet'
- split: 2023_12_30T02_09_55.411463
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T02-09-55.411463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T02-09-55.411463.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_29T17_38_56.111573
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T17-38-56.111573.parquet'
- split: 2023_12_30T02_09_55.411463
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T02-09-55.411463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T02-09-55.411463.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_29T17_38_56.111573
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T17-38-56.111573.parquet'
- split: 2023_12_30T02_09_55.411463
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T02-09-55.411463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T02-09-55.411463.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_29T17_38_56.111573
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T17-38-56.111573.parquet'
- split: 2023_12_30T02_09_55.411463
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T02-09-55.411463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T02-09-55.411463.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_29T17_38_56.111573
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T17-38-56.111573.parquet'
- split: 2023_12_30T02_09_55.411463
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T02-09-55.411463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T02-09-55.411463.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_29T17_38_56.111573
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T17-38-56.111573.parquet'
- split: 2023_12_30T02_09_55.411463
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T02-09-55.411463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T02-09-55.411463.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_29T17_38_56.111573
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T17-38-56.111573.parquet'
- split: 2023_12_30T02_09_55.411463
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T02-09-55.411463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T02-09-55.411463.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_29T17_38_56.111573
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T17-38-56.111573.parquet'
- split: 2023_12_30T02_09_55.411463
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T02-09-55.411463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T02-09-55.411463.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_29T17_38_56.111573
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T17-38-56.111573.parquet'
- split: 2023_12_30T02_09_55.411463
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T02-09-55.411463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T02-09-55.411463.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_29T17_38_56.111573
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T17-38-56.111573.parquet'
- split: 2023_12_30T02_09_55.411463
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T02-09-55.411463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T02-09-55.411463.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_29T17_38_56.111573
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-29T17-38-56.111573.parquet'
- split: 2023_12_30T02_09_55.411463
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T02-09-55.411463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T02-09-55.411463.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_29T17_38_56.111573
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T17-38-56.111573.parquet'
- split: 2023_12_30T02_09_55.411463
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T02-09-55.411463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T02-09-55.411463.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_29T17_38_56.111573
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-29T17-38-56.111573.parquet'
- split: 2023_12_30T02_09_55.411463
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T02-09-55.411463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T02-09-55.411463.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_29T17_38_56.111573
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T17-38-56.111573.parquet'
- split: 2023_12_30T02_09_55.411463
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T02-09-55.411463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T02-09-55.411463.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_29T17_38_56.111573
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T17-38-56.111573.parquet'
- split: 2023_12_30T02_09_55.411463
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T02-09-55.411463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T02-09-55.411463.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_29T17_38_56.111573
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T17-38-56.111573.parquet'
- split: 2023_12_30T02_09_55.411463
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T02-09-55.411463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T02-09-55.411463.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_29T17_38_56.111573
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-29T17-38-56.111573.parquet'
- split: 2023_12_30T02_09_55.411463
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-30T02-09-55.411463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-30T02-09-55.411463.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_29T17_38_56.111573
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-29T17-38-56.111573.parquet'
- split: 2023_12_30T02_09_55.411463
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T02-09-55.411463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T02-09-55.411463.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_29T17_38_56.111573
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T17-38-56.111573.parquet'
- split: 2023_12_30T02_09_55.411463
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T02-09-55.411463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T02-09-55.411463.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_29T17_38_56.111573
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T17-38-56.111573.parquet'
- split: 2023_12_30T02_09_55.411463
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T02-09-55.411463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T02-09-55.411463.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_29T17_38_56.111573
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T17-38-56.111573.parquet'
- split: 2023_12_30T02_09_55.411463
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T02-09-55.411463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T02-09-55.411463.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_29T17_38_56.111573
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T17-38-56.111573.parquet'
- split: 2023_12_30T02_09_55.411463
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T02-09-55.411463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T02-09-55.411463.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_29T17_38_56.111573
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-29T17-38-56.111573.parquet'
- split: 2023_12_30T02_09_55.411463
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T02-09-55.411463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T02-09-55.411463.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_29T17_38_56.111573
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-29T17-38-56.111573.parquet'
- split: 2023_12_30T02_09_55.411463
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T02-09-55.411463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T02-09-55.411463.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_29T17_38_56.111573
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-29T17-38-56.111573.parquet'
- split: 2023_12_30T02_09_55.411463
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T02-09-55.411463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T02-09-55.411463.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_29T17_38_56.111573
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T17-38-56.111573.parquet'
- split: 2023_12_30T02_09_55.411463
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T02-09-55.411463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T02-09-55.411463.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_29T17_38_56.111573
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-29T17-38-56.111573.parquet'
- split: 2023_12_30T02_09_55.411463
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T02-09-55.411463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T02-09-55.411463.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_29T17_38_56.111573
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T17-38-56.111573.parquet'
- split: 2023_12_30T02_09_55.411463
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T02-09-55.411463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T02-09-55.411463.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_29T17_38_56.111573
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T17-38-56.111573.parquet'
- split: 2023_12_30T02_09_55.411463
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T02-09-55.411463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T02-09-55.411463.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_29T17_38_56.111573
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-29T17-38-56.111573.parquet'
- split: 2023_12_30T02_09_55.411463
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T02-09-55.411463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T02-09-55.411463.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_29T17_38_56.111573
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-29T17-38-56.111573.parquet'
- split: 2023_12_30T02_09_55.411463
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T02-09-55.411463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T02-09-55.411463.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_29T17_38_56.111573
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-29T17-38-56.111573.parquet'
- split: 2023_12_30T02_09_55.411463
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T02-09-55.411463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T02-09-55.411463.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_29T17_38_56.111573
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T17-38-56.111573.parquet'
- split: 2023_12_30T02_09_55.411463
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T02-09-55.411463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T02-09-55.411463.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_29T17_38_56.111573
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-29T17-38-56.111573.parquet'
- split: 2023_12_30T02_09_55.411463
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T02-09-55.411463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T02-09-55.411463.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_29T17_38_56.111573
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-29T17-38-56.111573.parquet'
- split: 2023_12_30T02_09_55.411463
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T02-09-55.411463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T02-09-55.411463.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_29T17_38_56.111573
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-29T17-38-56.111573.parquet'
- split: 2023_12_30T02_09_55.411463
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-30T02-09-55.411463.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-30T02-09-55.411463.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_29T17_38_56.111573
path:
- '**/details_harness|winogrande|5_2023-12-29T17-38-56.111573.parquet'
- split: 2023_12_30T02_09_55.411463
path:
- '**/details_harness|winogrande|5_2023-12-30T02-09-55.411463.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-30T02-09-55.411463.parquet'
- config_name: results
data_files:
- split: 2023_12_29T17_38_56.111573
path:
- results_2023-12-29T17-38-56.111573.parquet
- split: 2023_12_30T02_09_55.411463
path:
- results_2023-12-30T02-09-55.411463.parquet
- split: latest
path:
- results_2023-12-30T02-09-55.411463.parquet
---
# Dataset Card for Evaluation run of OpenPipe/mistral-ft-optimized-1227
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [OpenPipe/mistral-ft-optimized-1227](https://huggingface.co/OpenPipe/mistral-ft-optimized-1227) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_OpenPipe__mistral-ft-optimized-1227",
"harness_winogrande_5",
split="train")
```
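Each timestamped split name encodes the run time in the pattern `YYYY_MM_DDTHH_MM_SS.ffffff`. If you prefer to resolve the most recent run yourself instead of relying on the `latest` alias, a minimal sketch (the `latest_split` helper is hypothetical, not part of the `datasets` library):

```python
from datetime import datetime

def latest_split(split_names):
    """Return the most recent timestamped split name.

    Skips the 'latest' entry, which is an alias rather than a timestamp.
    """
    stamped = [s for s in split_names if s != "latest"]
    return max(stamped, key=lambda s: datetime.strptime(s, "%Y_%m_%dT%H_%M_%S.%f"))

splits = ["2023_12_29T17_38_56.111573", "2023_12_30T02_09_55.411463", "latest"]
print(latest_split(splits))  # → 2023_12_30T02_09_55.411463
```

The same split names can then be passed as the `split` argument to `load_dataset` to pin a specific evaluation run.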
## Latest results
These are the [latest results from run 2023-12-30T02:09:55.411463](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenPipe__mistral-ft-optimized-1227/blob/main/results_2023-12-30T02-09-55.411463.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.655214805560653,
"acc_stderr": 0.03196791889189701,
"acc_norm": 0.6555902741913247,
"acc_norm_stderr": 0.032620412320903916,
"mc1": 0.37454100367197063,
"mc1_stderr": 0.016943535128405338,
"mc2": 0.5451472575391979,
"mc2_stderr": 0.015603038482785155
},
"harness|arc:challenge|25": {
"acc": 0.6390784982935154,
"acc_stderr": 0.014034761386175452,
"acc_norm": 0.6723549488054608,
"acc_norm_stderr": 0.013715847940719337
},
"harness|hellaswag|10": {
"acc": 0.6786496713802032,
"acc_stderr": 0.004660405565338758,
"acc_norm": 0.8589922326229835,
"acc_norm_stderr": 0.0034731828909689696
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.0421850621536888,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.0421850621536888
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.027834912527544064,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.027834912527544064
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6820809248554913,
"acc_stderr": 0.0355068398916558,
"acc_norm": 0.6820809248554913,
"acc_norm_stderr": 0.0355068398916558
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266345,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266345
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5829787234042553,
"acc_stderr": 0.03223276266711712,
"acc_norm": 0.5829787234042553,
"acc_norm_stderr": 0.03223276266711712
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482758,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482758
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.025424835086924003,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.025424835086924003
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04444444444444449,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04444444444444449
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.02315787934908353,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.02315787934908353
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009182,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009182
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.030088629490217483,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.030088629490217483
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.020986854593289733,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.020986854593289733
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6743589743589744,
"acc_stderr": 0.02375966576741229,
"acc_norm": 0.6743589743589744,
"acc_norm_stderr": 0.02375966576741229
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37407407407407406,
"acc_stderr": 0.02950286112895529,
"acc_norm": 0.37407407407407406,
"acc_norm_stderr": 0.02950286112895529
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7016806722689075,
"acc_stderr": 0.02971914287634286,
"acc_norm": 0.7016806722689075,
"acc_norm_stderr": 0.02971914287634286
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8495412844036697,
"acc_stderr": 0.015328563932669237,
"acc_norm": 0.8495412844036697,
"acc_norm_stderr": 0.015328563932669237
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5231481481481481,
"acc_stderr": 0.034063153607115086,
"acc_norm": 0.5231481481481481,
"acc_norm_stderr": 0.034063153607115086
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.02552472232455335,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.02552472232455335
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.025955020841621115,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.025955020841621115
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.03498149385462472,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.03498149385462472
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7914110429447853,
"acc_stderr": 0.03192193448934725,
"acc_norm": 0.7914110429447853,
"acc_norm_stderr": 0.03192193448934725
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5178571428571429,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.5178571428571429,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.037601780060266196,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.037601780060266196
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.023086635086841407,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.023086635086841407
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8352490421455939,
"acc_stderr": 0.013265346261323792,
"acc_norm": 0.8352490421455939,
"acc_norm_stderr": 0.013265346261323792
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7427745664739884,
"acc_stderr": 0.023532925431044283,
"acc_norm": 0.7427745664739884,
"acc_norm_stderr": 0.023532925431044283
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3843575418994413,
"acc_stderr": 0.016269088663959406,
"acc_norm": 0.3843575418994413,
"acc_norm_stderr": 0.016269088663959406
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7418300653594772,
"acc_stderr": 0.02505850331695814,
"acc_norm": 0.7418300653594772,
"acc_norm_stderr": 0.02505850331695814
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7266881028938906,
"acc_stderr": 0.02531176597542612,
"acc_norm": 0.7266881028938906,
"acc_norm_stderr": 0.02531176597542612
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.024383665531035454,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.024383665531035454
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47979139504563234,
"acc_stderr": 0.012759801427767566,
"acc_norm": 0.47979139504563234,
"acc_norm_stderr": 0.012759801427767566
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7095588235294118,
"acc_stderr": 0.027576468622740533,
"acc_norm": 0.7095588235294118,
"acc_norm_stderr": 0.027576468622740533
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.018926082916083383,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.018926082916083383
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7183673469387755,
"acc_stderr": 0.028795185574291293,
"acc_norm": 0.7183673469387755,
"acc_norm_stderr": 0.028795185574291293
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8606965174129353,
"acc_stderr": 0.02448448716291397,
"acc_norm": 0.8606965174129353,
"acc_norm_stderr": 0.02448448716291397
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160893,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160893
},
"harness|truthfulqa:mc|0": {
"mc1": 0.37454100367197063,
"mc1_stderr": 0.016943535128405338,
"mc2": 0.5451472575391979,
"mc2_stderr": 0.015603038482785155
},
"harness|winogrande|5": {
"acc": 0.7884767166535123,
"acc_stderr": 0.011477747684223188
},
"harness|gsm8k|5": {
"acc": 0.7134192570128886,
"acc_stderr": 0.012454841668337697
}
}
```
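As a minimal sketch of working with the results above (the dictionary excerpt below mirrors the structure of the JSON but is hand-copied for illustration; the averaging shown is not the official leaderboard aggregation), the per-subtask `acc` values for the MMLU (`hendrycksTest`) tasks can be averaged like this:

```python
def mean_hendrycks_acc(results: dict) -> float:
    """Average the 'acc' metric over all hendrycksTest (MMLU) subtasks."""
    accs = [
        metrics["acc"]
        for task, metrics in results.items()
        if task.startswith("harness|hendrycksTest") and "acc" in metrics
    ]
    return sum(accs) / len(accs)

# Small excerpt mirroring the structure of the results JSON above;
# non-MMLU tasks (e.g. gsm8k) are filtered out by the prefix check.
sample = {
    "harness|hendrycksTest-marketing|5": {"acc": 0.8547008547008547},
    "harness|hendrycksTest-virology|5": {"acc": 0.5421686746987951},
    "harness|gsm8k|5": {"acc": 0.7134192570128886},
}
print(round(mean_hendrycks_acc(sample), 4))
```

The same function applies unchanged to the full results file linked above once it has been loaded with `json.load`.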
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
syp1229/E_normal_over70 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: audio
struct:
- name: array
sequence: float64
- name: path
dtype: string
- name: sample_rate
dtype: int64
- name: text
dtype: string
- name: scriptId
dtype: int64
- name: fileNm
dtype: string
- name: recrdTime
dtype: float64
- name: recrdQuality
dtype: int64
- name: recrdDt
dtype: string
- name: scriptSetNo
dtype: string
- name: recrdEnvrn
dtype: string
- name: colctUnitCode
dtype: string
- name: cityCode
dtype: string
- name: recrdUnit
dtype: string
- name: convrsThema
dtype: string
- name: gender
dtype: string
- name: recorderId
dtype: string
- name: age
dtype: int64
splits:
- name: train
num_bytes: 4754000539
num_examples: 5400
- name: test
num_bytes: 717103646
num_examples: 600
download_size: 1271271582
dataset_size: 5471104185
---
# Dataset Card for "E_normal_over70"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Lucca192/Jimin | ---
license: openrail
---
|