datasetId | card |
|---|---|
open-llm-leaderboard/details_CobraMamba__mamba-gpt-7b-v2 | ---
pretty_name: Evaluation run of CobraMamba/mamba-gpt-7b-v2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [CobraMamba/mamba-gpt-7b-v2](https://huggingface.co/CobraMamba/mamba-gpt-7b-v2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CobraMamba__mamba-gpt-7b-v2_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-09T14:42:44.506385](https://huggingface.co/datasets/open-llm-leaderboard/details_CobraMamba__mamba-gpt-7b-v2_public/blob/main/results_2023-11-09T14-42-44.506385.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6125048552997057,\n\
\ \"acc_stderr\": 0.03288150582791299,\n \"acc_norm\": 0.621215728198735,\n\
\ \"acc_norm_stderr\": 0.03360029488770885,\n \"mc1\": 0.30599755201958384,\n\
\ \"mc1_stderr\": 0.016132229728155045,\n \"mc2\": 0.466285204838536,\n\
\ \"mc2_stderr\": 0.014482857157517471,\n \"em\": 0.2946728187919463,\n\
\ \"em_stderr\": 0.004668797098936446,\n \"f1\": 0.3407151845637583,\n\
\ \"f1_stderr\": 0.004587411171504163\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5750853242320819,\n \"acc_stderr\": 0.014445698968520769,\n\
\ \"acc_norm\": 0.6194539249146758,\n \"acc_norm_stderr\": 0.01418827771234981\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6363274248157738,\n\
\ \"acc_stderr\": 0.004800728138792391,\n \"acc_norm\": 0.8382792272455686,\n\
\ \"acc_norm_stderr\": 0.00367441979935367\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n\
\ \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n\
\ \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.631578947368421,\n \"acc_stderr\": 0.03925523381052932,\n\
\ \"acc_norm\": 0.631578947368421,\n \"acc_norm_stderr\": 0.03925523381052932\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.660377358490566,\n \"acc_stderr\": 0.029146904747798328,\n\
\ \"acc_norm\": 0.660377358490566,\n \"acc_norm_stderr\": 0.029146904747798328\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7152777777777778,\n\
\ \"acc_stderr\": 0.03773809990686934,\n \"acc_norm\": 0.7152777777777778,\n\
\ \"acc_norm_stderr\": 0.03773809990686934\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.41,\n\
\ \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.41,\n \
\ \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.5895953757225434,\n \"acc_stderr\": 0.03750757044895537,\n\
\ \"acc_norm\": 0.5895953757225434,\n \"acc_norm_stderr\": 0.03750757044895537\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.38235294117647056,\n\
\ \"acc_stderr\": 0.04835503696107223,\n \"acc_norm\": 0.38235294117647056,\n\
\ \"acc_norm_stderr\": 0.04835503696107223\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5191489361702127,\n\
\ \"acc_stderr\": 0.032662042990646796,\n \"acc_norm\": 0.5191489361702127,\n\
\ \"acc_norm_stderr\": 0.032662042990646796\n },\n \"harness|hendrycksTest-econometrics|5\"\
: {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.04702880432049615,\n\
\ \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.04702880432049615\n\
\ },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\"\
: 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482758,\n \"\
acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482758\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4074074074074074,\n \"acc_stderr\": 0.025305906241590632,\n \"\
acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.025305906241590632\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n\
\ \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n\
\ \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7225806451612903,\n\
\ \"acc_stderr\": 0.025470196835900055,\n \"acc_norm\": 0.7225806451612903,\n\
\ \"acc_norm_stderr\": 0.025470196835900055\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4630541871921182,\n \"acc_stderr\": 0.035083705204426656,\n\
\ \"acc_norm\": 0.4630541871921182,\n \"acc_norm_stderr\": 0.035083705204426656\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\"\
: 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009181,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009181\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"\
acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8186528497409327,\n \"acc_stderr\": 0.02780703236068609,\n\
\ \"acc_norm\": 0.8186528497409327,\n \"acc_norm_stderr\": 0.02780703236068609\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6128205128205129,\n \"acc_stderr\": 0.024697216930878937,\n\
\ \"acc_norm\": 0.6128205128205129,\n \"acc_norm_stderr\": 0.024697216930878937\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34444444444444444,\n \"acc_stderr\": 0.02897264888484427,\n \
\ \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.02897264888484427\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6092436974789915,\n \"acc_stderr\": 0.03169380235712997,\n \
\ \"acc_norm\": 0.6092436974789915,\n \"acc_norm_stderr\": 0.03169380235712997\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2980132450331126,\n \"acc_stderr\": 0.037345356767871984,\n \"\
acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.037345356767871984\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7779816513761468,\n \"acc_stderr\": 0.01781884956479664,\n \"\
acc_norm\": 0.7779816513761468,\n \"acc_norm_stderr\": 0.01781884956479664\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5185185185185185,\n \"acc_stderr\": 0.03407632093854051,\n \"\
acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.03407632093854051\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7450980392156863,\n \"acc_stderr\": 0.03058759135160425,\n \"\
acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.03058759135160425\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7763713080168776,\n \"acc_stderr\": 0.027123298205229966,\n \
\ \"acc_norm\": 0.7763713080168776,\n \"acc_norm_stderr\": 0.027123298205229966\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6636771300448431,\n\
\ \"acc_stderr\": 0.031708824268455,\n \"acc_norm\": 0.6636771300448431,\n\
\ \"acc_norm_stderr\": 0.031708824268455\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306086,\n\
\ \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306086\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.03749492448709698,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.03749492448709698\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n\
\ \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n\
\ \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.03462419931615624,\n\
\ \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.03462419931615624\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n\
\ \"acc_stderr\": 0.023086635086841407,\n \"acc_norm\": 0.8547008547008547,\n\
\ \"acc_norm_stderr\": 0.023086635086841407\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768079\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7918263090676884,\n\
\ \"acc_stderr\": 0.014518592248904033,\n \"acc_norm\": 0.7918263090676884,\n\
\ \"acc_norm_stderr\": 0.014518592248904033\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6763005780346821,\n \"acc_stderr\": 0.025190181327608405,\n\
\ \"acc_norm\": 0.6763005780346821,\n \"acc_norm_stderr\": 0.025190181327608405\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.358659217877095,\n\
\ \"acc_stderr\": 0.016040454426164474,\n \"acc_norm\": 0.358659217877095,\n\
\ \"acc_norm_stderr\": 0.016040454426164474\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.025553169991826528,\n\
\ \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.025553169991826528\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6913183279742765,\n\
\ \"acc_stderr\": 0.026236965881153266,\n \"acc_norm\": 0.6913183279742765,\n\
\ \"acc_norm_stderr\": 0.026236965881153266\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6604938271604939,\n \"acc_stderr\": 0.02634856441201162,\n\
\ \"acc_norm\": 0.6604938271604939,\n \"acc_norm_stderr\": 0.02634856441201162\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4397163120567376,\n \"acc_stderr\": 0.029609912075594106,\n \
\ \"acc_norm\": 0.4397163120567376,\n \"acc_norm_stderr\": 0.029609912075594106\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44002607561929596,\n\
\ \"acc_stderr\": 0.012678037478574513,\n \"acc_norm\": 0.44002607561929596,\n\
\ \"acc_norm_stderr\": 0.012678037478574513\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6397058823529411,\n \"acc_stderr\": 0.02916312857067073,\n\
\ \"acc_norm\": 0.6397058823529411,\n \"acc_norm_stderr\": 0.02916312857067073\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6372549019607843,\n \"acc_stderr\": 0.019450768432505514,\n \
\ \"acc_norm\": 0.6372549019607843,\n \"acc_norm_stderr\": 0.019450768432505514\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6979591836734694,\n \"acc_stderr\": 0.029393609319879804,\n\
\ \"acc_norm\": 0.6979591836734694,\n \"acc_norm_stderr\": 0.029393609319879804\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7960199004975125,\n\
\ \"acc_stderr\": 0.02849317624532607,\n \"acc_norm\": 0.7960199004975125,\n\
\ \"acc_norm_stderr\": 0.02849317624532607\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197771,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197771\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5783132530120482,\n\
\ \"acc_stderr\": 0.038444531817709175,\n \"acc_norm\": 0.5783132530120482,\n\
\ \"acc_norm_stderr\": 0.038444531817709175\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.30599755201958384,\n\
\ \"mc1_stderr\": 0.016132229728155045,\n \"mc2\": 0.466285204838536,\n\
\ \"mc2_stderr\": 0.014482857157517471\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7845303867403315,\n \"acc_stderr\": 0.011555295286059282\n\
\ },\n \"harness|drop|3\": {\n \"em\": 0.2946728187919463,\n \
\ \"em_stderr\": 0.004668797098936446,\n \"f1\": 0.3407151845637583,\n \
\ \"f1_stderr\": 0.004587411171504163\n },\n \"harness|gsm8k|5\": {\n\
\ \"acc\": 0.1728582259287339,\n \"acc_stderr\": 0.010415432246200569\n\
\ }\n}\n```"
repo_url: https://huggingface.co/CobraMamba/mamba-gpt-7b-v2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_11_09T14_42_44.506385
path:
- '**/details_harness|arc:challenge|25_2023-11-09T14-42-44.506385.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-11-09T14-42-44.506385.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_11_09T14_42_44.506385
path:
- '**/details_harness|drop|3_2023-11-09T14-42-44.506385.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-09T14-42-44.506385.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_09T14_42_44.506385
path:
- '**/details_harness|gsm8k|5_2023-11-09T14-42-44.506385.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-09T14-42-44.506385.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_11_09T14_42_44.506385
path:
- '**/details_harness|hellaswag|10_2023-11-09T14-42-44.506385.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-11-09T14-42-44.506385.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_11_09T14_42_44.506385
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-09T14-42-44.506385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-09T14-42-44.506385.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-09T14-42-44.506385.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_11_09T14_42_44.506385
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-09T14-42-44.506385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-09T14-42-44.506385.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_11_09T14_42_44.506385
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-09T14-42-44.506385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-09T14-42-44.506385.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_11_09T14_42_44.506385
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-09T14-42-44.506385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-09T14-42-44.506385.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_11_09T14_42_44.506385
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-09T14-42-44.506385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-09T14-42-44.506385.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_11_09T14_42_44.506385
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-09T14-42-44.506385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-09T14-42-44.506385.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_11_09T14_42_44.506385
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-09T14-42-44.506385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-09T14-42-44.506385.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_11_09T14_42_44.506385
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-09T14-42-44.506385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-09T14-42-44.506385.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_11_09T14_42_44.506385
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-09T14-42-44.506385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-09T14-42-44.506385.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_11_09T14_42_44.506385
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-09T14-42-44.506385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-09T14-42-44.506385.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_11_09T14_42_44.506385
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-09T14-42-44.506385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-09T14-42-44.506385.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_11_09T14_42_44.506385
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-09T14-42-44.506385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-09T14-42-44.506385.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_11_09T14_42_44.506385
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-09T14-42-44.506385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-09T14-42-44.506385.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_11_09T14_42_44.506385
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-09T14-42-44.506385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-09T14-42-44.506385.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_11_09T14_42_44.506385
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-09T14-42-44.506385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-09T14-42-44.506385.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_11_09T14_42_44.506385
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-09T14-42-44.506385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-09T14-42-44.506385.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_11_09T14_42_44.506385
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-09T14-42-44.506385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-09T14-42-44.506385.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_11_09T14_42_44.506385
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-09T14-42-44.506385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-09T14-42-44.506385.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_11_09T14_42_44.506385
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-09T14-42-44.506385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-09T14-42-44.506385.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_11_09T14_42_44.506385
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-09T14-42-44.506385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-09T14-42-44.506385.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_11_09T14_42_44.506385
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-09T14-42-44.506385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-09T14-42-44.506385.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_11_09T14_42_44.506385
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-09T14-42-44.506385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-09T14-42-44.506385.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_11_09T14_42_44.506385
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-09T14-42-44.506385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-09T14-42-44.506385.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_11_09T14_42_44.506385
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-09T14-42-44.506385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-09T14-42-44.506385.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_11_09T14_42_44.506385
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-09T14-42-44.506385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-09T14-42-44.506385.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_11_09T14_42_44.506385
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-09T14-42-44.506385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-09T14-42-44.506385.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_11_09T14_42_44.506385
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-09T14-42-44.506385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-09T14-42-44.506385.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_11_09T14_42_44.506385
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-09T14-42-44.506385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-09T14-42-44.506385.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_11_09T14_42_44.506385
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-09T14-42-44.506385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-09T14-42-44.506385.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_11_09T14_42_44.506385
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-09T14-42-44.506385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-09T14-42-44.506385.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_11_09T14_42_44.506385
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-09T14-42-44.506385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-09T14-42-44.506385.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_11_09T14_42_44.506385
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-09T14-42-44.506385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-09T14-42-44.506385.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_11_09T14_42_44.506385
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-09T14-42-44.506385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-09T14-42-44.506385.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_11_09T14_42_44.506385
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-09T14-42-44.506385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-09T14-42-44.506385.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_11_09T14_42_44.506385
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-09T14-42-44.506385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-09T14-42-44.506385.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_11_09T14_42_44.506385
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-09T14-42-44.506385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-09T14-42-44.506385.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_11_09T14_42_44.506385
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-09T14-42-44.506385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-09T14-42-44.506385.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_11_09T14_42_44.506385
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-09T14-42-44.506385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-09T14-42-44.506385.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_11_09T14_42_44.506385
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-09T14-42-44.506385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-09T14-42-44.506385.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_11_09T14_42_44.506385
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-09T14-42-44.506385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-09T14-42-44.506385.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_11_09T14_42_44.506385
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-09T14-42-44.506385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-09T14-42-44.506385.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_11_09T14_42_44.506385
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-09T14-42-44.506385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-09T14-42-44.506385.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_11_09T14_42_44.506385
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-09T14-42-44.506385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-09T14-42-44.506385.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_11_09T14_42_44.506385
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-09T14-42-44.506385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-09T14-42-44.506385.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_11_09T14_42_44.506385
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-09T14-42-44.506385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-09T14-42-44.506385.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_11_09T14_42_44.506385
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-09T14-42-44.506385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-09T14-42-44.506385.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_11_09T14_42_44.506385
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-09T14-42-44.506385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-09T14-42-44.506385.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_11_09T14_42_44.506385
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-09T14-42-44.506385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-09T14-42-44.506385.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_11_09T14_42_44.506385
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-09T14-42-44.506385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-09T14-42-44.506385.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_11_09T14_42_44.506385
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-09T14-42-44.506385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-09T14-42-44.506385.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_11_09T14_42_44.506385
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-09T14-42-44.506385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-09T14-42-44.506385.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_11_09T14_42_44.506385
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-09T14-42-44.506385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-09T14-42-44.506385.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_11_09T14_42_44.506385
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-09T14-42-44.506385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-09T14-42-44.506385.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_11_09T14_42_44.506385
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-09T14-42-44.506385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-09T14-42-44.506385.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_11_09T14_42_44.506385
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-09T14-42-44.506385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-09T14-42-44.506385.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_11_09T14_42_44.506385
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-09T14-42-44.506385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-09T14-42-44.506385.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_11_09T14_42_44.506385
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-09T14-42-44.506385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-09T14-42-44.506385.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_11_09T14_42_44.506385
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-09T14-42-44.506385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-09T14-42-44.506385.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_11_09T14_42_44.506385
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-09T14-42-44.506385.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-09T14-42-44.506385.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_09T14_42_44.506385
path:
- '**/details_harness|winogrande|5_2023-11-09T14-42-44.506385.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-09T14-42-44.506385.parquet'
- config_name: results
data_files:
- split: 2023_11_09T14_42_44.506385
path:
- results_2023-11-09T14-42-44.506385.parquet
- split: latest
path:
- results_2023-11-09T14-42-44.506385.parquet
---
# Dataset Card for Evaluation run of CobraMamba/mamba-gpt-7b-v2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/CobraMamba/mamba-gpt-7b-v2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [CobraMamba/mamba-gpt-7b-v2](https://huggingface.co/CobraMamba/mamba-gpt-7b-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CobraMamba__mamba-gpt-7b-v2_public",
"harness_winogrande_5",
split="train")
```
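The aggregated metrics are also exposed as a dedicated `results` configuration, and every per-task configuration provides a `latest` split pointing to the most recent run (see the `configs` list in the metadata above). A minimal sketch of both access patterns, reusing the same public repository as the example above:
```python
from datasets import load_dataset

repo = "open-llm-leaderboard/details_CobraMamba__mamba-gpt-7b-v2_public"

# Aggregated metrics of the most recent run (the "results" configuration)
results = load_dataset(repo, "results", split="latest")

# Per-sample details for a single task, pinned to the most recent run
abstract_algebra = load_dataset(repo, "harness_hendrycksTest_abstract_algebra_5", split="latest")
```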
## Latest results
These are the [latest results from run 2023-11-09T14:42:44.506385](https://huggingface.co/datasets/open-llm-leaderboard/details_CobraMamba__mamba-gpt-7b-v2_public/blob/main/results_2023-11-09T14-42-44.506385.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6125048552997057,
"acc_stderr": 0.03288150582791299,
"acc_norm": 0.621215728198735,
"acc_norm_stderr": 0.03360029488770885,
"mc1": 0.30599755201958384,
"mc1_stderr": 0.016132229728155045,
"mc2": 0.466285204838536,
"mc2_stderr": 0.014482857157517471,
"em": 0.2946728187919463,
"em_stderr": 0.004668797098936446,
"f1": 0.3407151845637583,
"f1_stderr": 0.004587411171504163
},
"harness|arc:challenge|25": {
"acc": 0.5750853242320819,
"acc_stderr": 0.014445698968520769,
"acc_norm": 0.6194539249146758,
"acc_norm_stderr": 0.01418827771234981
},
"harness|hellaswag|10": {
"acc": 0.6363274248157738,
"acc_stderr": 0.004800728138792391,
"acc_norm": 0.8382792272455686,
"acc_norm_stderr": 0.00367441979935367
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.631578947368421,
"acc_stderr": 0.03925523381052932,
"acc_norm": 0.631578947368421,
"acc_norm_stderr": 0.03925523381052932
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.660377358490566,
"acc_stderr": 0.029146904747798328,
"acc_norm": 0.660377358490566,
"acc_norm_stderr": 0.029146904747798328
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7152777777777778,
"acc_stderr": 0.03773809990686934,
"acc_norm": 0.7152777777777778,
"acc_norm_stderr": 0.03773809990686934
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5895953757225434,
"acc_stderr": 0.03750757044895537,
"acc_norm": 0.5895953757225434,
"acc_norm_stderr": 0.03750757044895537
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.74,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5191489361702127,
"acc_stderr": 0.032662042990646796,
"acc_norm": 0.5191489361702127,
"acc_norm_stderr": 0.032662042990646796
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482758,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482758
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.025305906241590632,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.025305906241590632
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.04360314860077459,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.04360314860077459
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7225806451612903,
"acc_stderr": 0.025470196835900055,
"acc_norm": 0.7225806451612903,
"acc_norm_stderr": 0.025470196835900055
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4630541871921182,
"acc_stderr": 0.035083705204426656,
"acc_norm": 0.4630541871921182,
"acc_norm_stderr": 0.035083705204426656
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009181,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009181
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8186528497409327,
"acc_stderr": 0.02780703236068609,
"acc_norm": 0.8186528497409327,
"acc_norm_stderr": 0.02780703236068609
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6128205128205129,
"acc_stderr": 0.024697216930878937,
"acc_norm": 0.6128205128205129,
"acc_norm_stderr": 0.024697216930878937
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.02897264888484427,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.02897264888484427
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6092436974789915,
"acc_stderr": 0.03169380235712997,
"acc_norm": 0.6092436974789915,
"acc_norm_stderr": 0.03169380235712997
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.037345356767871984,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.037345356767871984
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7779816513761468,
"acc_stderr": 0.01781884956479664,
"acc_norm": 0.7779816513761468,
"acc_norm_stderr": 0.01781884956479664
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.03407632093854051,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.03407632093854051
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7450980392156863,
"acc_stderr": 0.03058759135160425,
"acc_norm": 0.7450980392156863,
"acc_norm_stderr": 0.03058759135160425
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7763713080168776,
"acc_stderr": 0.027123298205229966,
"acc_norm": 0.7763713080168776,
"acc_norm_stderr": 0.027123298205229966
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6636771300448431,
"acc_stderr": 0.031708824268455,
"acc_norm": 0.6636771300448431,
"acc_norm_stderr": 0.031708824268455
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306086,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306086
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.03749492448709698,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.03749492448709698
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.042844679680521934,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.042844679680521934
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7361963190184049,
"acc_stderr": 0.03462419931615624,
"acc_norm": 0.7361963190184049,
"acc_norm_stderr": 0.03462419931615624
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.023086635086841407,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.023086635086841407
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7918263090676884,
"acc_stderr": 0.014518592248904033,
"acc_norm": 0.7918263090676884,
"acc_norm_stderr": 0.014518592248904033
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.025190181327608405,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.025190181327608405
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.358659217877095,
"acc_stderr": 0.016040454426164474,
"acc_norm": 0.358659217877095,
"acc_norm_stderr": 0.016040454426164474
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.025553169991826528,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.025553169991826528
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6913183279742765,
"acc_stderr": 0.026236965881153266,
"acc_norm": 0.6913183279742765,
"acc_norm_stderr": 0.026236965881153266
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6604938271604939,
"acc_stderr": 0.02634856441201162,
"acc_norm": 0.6604938271604939,
"acc_norm_stderr": 0.02634856441201162
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4397163120567376,
"acc_stderr": 0.029609912075594106,
"acc_norm": 0.4397163120567376,
"acc_norm_stderr": 0.029609912075594106
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44002607561929596,
"acc_stderr": 0.012678037478574513,
"acc_norm": 0.44002607561929596,
"acc_norm_stderr": 0.012678037478574513
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6397058823529411,
"acc_stderr": 0.02916312857067073,
"acc_norm": 0.6397058823529411,
"acc_norm_stderr": 0.02916312857067073
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6372549019607843,
"acc_stderr": 0.019450768432505514,
"acc_norm": 0.6372549019607843,
"acc_norm_stderr": 0.019450768432505514
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6979591836734694,
"acc_stderr": 0.029393609319879804,
"acc_norm": 0.6979591836734694,
"acc_norm_stderr": 0.029393609319879804
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7960199004975125,
"acc_stderr": 0.02849317624532607,
"acc_norm": 0.7960199004975125,
"acc_norm_stderr": 0.02849317624532607
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197771,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197771
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5783132530120482,
"acc_stderr": 0.038444531817709175,
"acc_norm": 0.5783132530120482,
"acc_norm_stderr": 0.038444531817709175
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.30599755201958384,
"mc1_stderr": 0.016132229728155045,
"mc2": 0.466285204838536,
"mc2_stderr": 0.014482857157517471
},
"harness|winogrande|5": {
"acc": 0.7845303867403315,
"acc_stderr": 0.011555295286059282
},
"harness|drop|3": {
"em": 0.2946728187919463,
"em_stderr": 0.004668797098936446,
"f1": 0.3407151845637583,
"f1_stderr": 0.004587411171504163
},
"harness|gsm8k|5": {
"acc": 0.1728582259287339,
"acc_stderr": 0.010415432246200569
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
joey234/mmlu-prehistory-verbal-neg-prepend | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: neg_prompt
dtype: string
splits:
- name: test
num_bytes: 137437
num_examples: 324
download_size: 82027
dataset_size: 137437
---
# Dataset Card for "mmlu-prehistory-verbal-neg-prepend"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Sohaib1267/glucoseTransporter | ---
license: llama2
---
|
yuan-sf63/word_label_0.8_96_P | ---
dataset_info:
features:
- name: text
dtype: string
- name: '0'
dtype: int64
- name: '1'
dtype: int64
- name: '2'
dtype: int64
- name: '3'
dtype: int64
- name: '4'
dtype: int64
- name: '5'
dtype: int64
- name: '6'
dtype: int64
- name: '7'
dtype: int64
- name: '8'
dtype: int64
- name: '9'
dtype: int64
- name: '10'
dtype: int64
- name: '11'
dtype: int64
- name: '12'
dtype: int64
- name: '13'
dtype: int64
- name: '14'
dtype: int64
- name: '15'
dtype: int64
- name: '16'
dtype: int64
- name: '17'
dtype: int64
- name: '18'
dtype: int64
- name: '19'
dtype: int64
- name: '20'
dtype: int64
- name: '21'
dtype: int64
- name: '22'
dtype: int64
- name: '23'
dtype: int64
- name: '24'
dtype: int64
- name: '25'
dtype: int64
- name: '26'
dtype: int64
- name: '27'
dtype: int64
- name: '28'
dtype: int64
- name: '29'
dtype: int64
- name: '30'
dtype: int64
- name: '31'
dtype: int64
- name: '32'
dtype: int64
- name: '33'
dtype: int64
- name: '34'
dtype: int64
- name: '35'
dtype: int64
- name: '36'
dtype: int64
- name: '37'
dtype: int64
- name: '38'
dtype: int64
- name: '39'
dtype: int64
- name: '40'
dtype: int64
- name: '41'
dtype: int64
- name: '42'
dtype: int64
- name: '43'
dtype: int64
- name: '44'
dtype: int64
- name: '45'
dtype: int64
- name: '46'
dtype: int64
- name: '47'
dtype: int64
- name: '48'
dtype: int64
- name: '49'
dtype: int64
- name: '50'
dtype: int64
- name: '51'
dtype: int64
- name: '52'
dtype: int64
- name: '53'
dtype: int64
- name: '54'
dtype: int64
- name: '55'
dtype: int64
- name: '56'
dtype: int64
- name: '57'
dtype: int64
- name: '58'
dtype: int64
- name: '59'
dtype: int64
- name: '60'
dtype: int64
- name: '61'
dtype: int64
- name: '62'
dtype: int64
- name: '63'
dtype: int64
- name: '64'
dtype: int64
- name: '65'
dtype: int64
- name: '66'
dtype: int64
- name: '67'
dtype: int64
- name: '68'
dtype: int64
- name: '69'
dtype: int64
- name: '70'
dtype: int64
- name: '71'
dtype: int64
- name: '72'
dtype: int64
- name: '73'
dtype: int64
- name: '74'
dtype: int64
- name: '75'
dtype: int64
- name: '76'
dtype: int64
- name: '77'
dtype: int64
- name: '78'
dtype: int64
- name: '79'
dtype: int64
- name: '80'
dtype: int64
- name: '81'
dtype: int64
- name: '82'
dtype: int64
- name: '83'
dtype: int64
- name: '84'
dtype: int64
- name: '85'
dtype: int64
- name: '86'
dtype: int64
- name: '87'
dtype: int64
- name: '88'
dtype: int64
- name: '89'
dtype: int64
- name: '90'
dtype: int64
- name: '91'
dtype: int64
- name: '92'
dtype: int64
- name: '93'
dtype: int64
- name: '94'
dtype: int64
- name: '95'
dtype: int64
splits:
- name: train
num_bytes: 64886319.180816665
num_examples: 71809
- name: validation
num_bytes: 7209791.819183336
num_examples: 7979
download_size: 11047905
dataset_size: 72096111.0
---
# Dataset Card for "word_label_0.8_96_P"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joytafty/denoising-dirty-documents-trained_cleaned | ---
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 6620518.0
num_examples: 144
download_size: 0
dataset_size: 6620518.0
---
# Dataset Card for "denoising-dirty-documents-trained_cleaned"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
drewtray/vsr | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 1779868662.52
num_examples: 3620
download_size: 1776764407
dataset_size: 1779868662.52
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Kokoboy/ShonenJump_Cover_Magazine | ---
license: openrail
---
|
tyzhu/squad_qa_wrong_title_v5_full_random_permute_2 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
- name: answer
dtype: string
- name: context_id
dtype: string
- name: correct_id
dtype: string
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 5181584.9603305785
num_examples: 3365
- name: validation
num_bytes: 361864
num_examples: 300
download_size: 1320331
dataset_size: 5543448.9603305785
---
# Dataset Card for "squad_qa_wrong_title_v5_full_random_permute_2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
valentinamihalescu/mapillary-vistas-dataset | ---
task_categories:
- image-segmentation
- image-classification
tags:
- code
size_categories:
- 10K<n<100K
--- |
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-html-98000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 658794
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
autoevaluate/autoeval-eval-multi_news-default-6ca477-44786145147 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- multi_news
eval_info:
task: summarization
model: facebook/bart-large-cnn
metrics: ['rouge']
dataset_name: multi_news
dataset_config: default
dataset_split: test
col_mapping:
text: document
target: summary
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: facebook/bart-large-cnn
* Dataset: multi_news
* Config: default
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
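For reference, below is a minimal local sketch of what this evaluation computes; it is not AutoTrain's actual pipeline, and it assumes the `datasets`, `transformers`, and `evaluate` libraries are installed.
```python
# Minimal local sketch of the evaluation above (not AutoTrain's pipeline).
from datasets import load_dataset
from transformers import pipeline
import evaluate

# Load a small slice of the multi_news test split (the full split is large).
dataset = load_dataset("multi_news", split="test[:8]")

# Summarization model under evaluation.
summarizer = pipeline("summarization", model="facebook/bart-large-cnn")

# Per the column mapping above: text -> document, target -> summary.
predictions = [out["summary_text"] for out in summarizer(dataset["document"], truncation=True)]
references = dataset["summary"]

# ROUGE is the metric listed in the card.
rouge = evaluate.load("rouge")
print(rouge.compute(predictions=predictions, references=references))
```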
## Contributions
Thanks to [@Brez](https://huggingface.co/Brez) for evaluating this model. |
LNTANOooo/evol_instruct_zh_gpt4_v3 | ---
dataset_info:
features:
- name: conversation
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 114608204.93582027
num_examples: 68937
download_size: 67843518
dataset_size: 114608204.93582027
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
johannes-garstenauer/balanced_structs_reduced_labelled_large_enc_key_name_addr | ---
dataset_info:
features:
- name: struct
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 45906540.0
num_examples: 279780
download_size: 9156256
dataset_size: 45906540.0
---
# Dataset Card for "balanced_structs_reduced_labelled_large_enc_key_name_addr"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/ibuki_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of ibuki/伊吹/伊吹 (Azur Lane)
This is the dataset of ibuki/伊吹/伊吹 (Azur Lane), containing 153 images and their tags.
The core tags of this character are `long_hair, blue_eyes, blue_hair, heterochromia, horns, red_eyes, breasts, multiple_horns, bangs, large_breasts, very_long_hair, ribbon`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 153 | 262.55 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ibuki_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 153 | 130.01 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ibuki_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 399 | 294.55 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ibuki_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 153 | 225.56 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ibuki_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 399 | 445.10 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ibuki_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/ibuki_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 9 |  |  |  |  |  | 1girl, looking_at_viewer, solo, bare_shoulders, blush, wide_sleeves, smile, holding_umbrella, oil-paper_umbrella, upper_body, blunt_bangs, closed_mouth, blue_kimono, hair_flower, hair_rings, obi, official_alternate_costume, purple_kimono |
| 1 | 6 |  |  |  |  |  | 1girl, bare_shoulders, detached_sleeves, solo, wide_sleeves, black_thighhighs, looking_at_viewer, white_dress, glowing, holding_sword, short_dress, underboob_cutout |
| 2 | 18 |  |  |  |  |  | 1girl, looking_at_viewer, solo, underboob_cutout, blush, glowing_eyes, bare_shoulders, white_ribbon, armpits, detached_sleeves, arms_up, sideless_outfit, sideboob, simple_background, arms_behind_head, covered_navel, open_mouth, white_background, hair_ribbon, black_thighhighs, long_sleeves, upper_body, white_dress, smile |
| 3 | 5 |  |  |  |  |  | 1girl, glowing, looking_at_viewer, simple_background, solo, white_background, blush, collarbone, hair_ribbon, blunt_bangs, closed_mouth, smile, upper_body, white_ribbon, bare_shoulders, cleavage, frills, hand_up, medium_breasts, nude |
| 4 | 12 |  |  |  |  |  | 1girl, solo, double_bun, china_dress, looking_at_viewer, long_sleeves, cleavage_cutout, open_mouth, wide_sleeves, medium_breasts, official_alternate_costume, :d, blush, paper_lantern |
| 5 | 10 |  |  |  |  |  | 1boy, 1girl, blush, hetero, solo_focus, penis, nude, sweat, bar_censor, hair_ribbon, nipples, open_mouth, sex, vaginal, white_ribbon, collarbone, looking_at_viewer, navel, thighhighs, cum_in_pussy, lying, pillow, spread_legs |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | bare_shoulders | blush | wide_sleeves | smile | holding_umbrella | oil-paper_umbrella | upper_body | blunt_bangs | closed_mouth | blue_kimono | hair_flower | hair_rings | obi | official_alternate_costume | purple_kimono | detached_sleeves | black_thighhighs | white_dress | glowing | holding_sword | short_dress | underboob_cutout | glowing_eyes | white_ribbon | armpits | arms_up | sideless_outfit | sideboob | simple_background | arms_behind_head | covered_navel | open_mouth | white_background | hair_ribbon | long_sleeves | collarbone | cleavage | frills | hand_up | medium_breasts | nude | double_bun | china_dress | cleavage_cutout | :d | paper_lantern | 1boy | hetero | solo_focus | penis | sweat | bar_censor | nipples | sex | vaginal | navel | thighhighs | cum_in_pussy | lying | pillow | spread_legs |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:-----------------|:--------|:---------------|:--------|:-------------------|:---------------------|:-------------|:--------------|:---------------|:--------------|:--------------|:-------------|:------|:-----------------------------|:----------------|:-------------------|:-------------------|:--------------|:----------|:----------------|:--------------|:-------------------|:---------------|:---------------|:----------|:----------|:------------------|:-----------|:--------------------|:-------------------|:----------------|:-------------|:-------------------|:--------------|:---------------|:-------------|:-----------|:---------|:----------|:-----------------|:-------|:-------------|:--------------|:------------------|:-----|:----------------|:-------|:---------|:-------------|:--------|:--------|:-------------|:----------|:------|:----------|:--------|:-------------|:---------------|:--------|:---------|:--------------|
| 0 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | X | X | | X | | | | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 18 |  |  |  |  |  | X | X | X | X | X | | X | | | X | | | | | | | | | X | X | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | X | X | X | X | X | | X | | | X | X | X | | | | | | | | | | X | | | | | X | | | | | X | | | | X | X | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | |
| 4 | 12 |  |  |  |  |  | X | X | X | | X | X | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | X | | | X | | | | | X | | X | X | X | X | X | | | | | | | | | | | | | | | |
| 5 | 10 |  |  |  |  |  | X | X | | | X | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | X | | X | | X | | | | | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
kamaludeen/colorectal-cancer | ---
license: apache-2.0
---
|
AppleHarem/jackie_arknights | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of jackie (Arknights)
This is the dataset of jackie (Arknights), containing 26 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
This is a WebUI that contains crawlers and other things: [LittleAppleWebUI](https://github.com/LittleApple-fp16/LittleAppleWebUI)
| Name | Images | Download | Description |
|:----------------|---------:|:----------------------------------------|:-----------------------------------------------------------------------------------------|
| raw | 26 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 70 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| raw-stage3-eyes | 75 | [Download](dataset-raw-stage3-eyes.zip) | 3-stage cropped (with eye-focus) raw data with meta information. |
| 384x512 | 26 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x704 | 26 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x880 | 26 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 70 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 70 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-p512-640 | 59 | [Download](dataset-stage3-p512-640.zip) | 3-stage cropped dataset with the area not less than 512x512 pixels. |
| stage3-eyes-640 | 75 | [Download](dataset-stage3-eyes-640.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 640 pixels. |
| stage3-eyes-800 | 75 | [Download](dataset-stage3-eyes-800.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 800 pixels. |
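The download links above are relative paths; as a minimal sketch (assuming the zip archives sit at the root of this dataset repository), one package can be fetched and extracted with `huggingface_hub` like this:
```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# Fetch one of the packages listed above (assumes the archive sits at the repo root).
zip_file = hf_hub_download(
    repo_id='AppleHarem/jackie_arknights',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# Extract the archive to a local directory.
dataset_dir = 'jackie_arknights_raw'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
```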
|
dim/povarenok | ---
dataset_info:
features:
- name: full_receipt_text
dtype: string
- name: steps
sequence: string
- name: title_receipt
dtype: string
- name: title
dtype: string
- name: ingridients
sequence: string
- name: views
dtype: int64
- name: likes
dtype: int64
- name: ups
dtype: int64
- name: link
dtype: string
splits:
- name: train
num_bytes: 176339660
num_examples: 46500
download_size: 49568770
dataset_size: 176339660
---
# Dataset Card for "povarenok"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_mrpc_plural_interrogative | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 7669
num_examples: 26
- name: train
num_bytes: 16017
num_examples: 56
download_size: 23792
dataset_size: 23686
---
# Dataset Card for "MULTI_VALUE_mrpc_plural_interrogative"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Undi95__Mistral-11B-TestBench10 | ---
pretty_name: Evaluation run of Undi95/Mistral-11B-TestBench10
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Undi95/Mistral-11B-TestBench10](https://huggingface.co/Undi95/Mistral-11B-TestBench10)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Undi95__Mistral-11B-TestBench10\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-11T20:32:37.017457](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__Mistral-11B-TestBench10/blob/main/results_2023-10-11T20-32-37.017457.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6389303587566675,\n\
\ \"acc_stderr\": 0.033080650268054235,\n \"acc_norm\": 0.6424809348544399,\n\
\ \"acc_norm_stderr\": 0.033059008030264514,\n \"mc1\": 0.39412484700122397,\n\
\ \"mc1_stderr\": 0.017106588140700322,\n \"mc2\": 0.5556543619352063,\n\
\ \"mc2_stderr\": 0.015507002997196854\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6220136518771331,\n \"acc_stderr\": 0.014169664520303098,\n\
\ \"acc_norm\": 0.6424914675767918,\n \"acc_norm_stderr\": 0.014005494275916576\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6533559051981677,\n\
\ \"acc_stderr\": 0.004749286071559565,\n \"acc_norm\": 0.8423620792670783,\n\
\ \"acc_norm_stderr\": 0.003636564286352674\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n\
\ \"acc_stderr\": 0.04188307537595852,\n \"acc_norm\": 0.6222222222222222,\n\
\ \"acc_norm_stderr\": 0.04188307537595852\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6710526315789473,\n \"acc_stderr\": 0.038234289699266046,\n\
\ \"acc_norm\": 0.6710526315789473,\n \"acc_norm_stderr\": 0.038234289699266046\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n\
\ \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.037455547914624555,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.037455547914624555\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n\
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n\
\ \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n\
\ \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5446808510638298,\n \"acc_stderr\": 0.03255525359340354,\n\
\ \"acc_norm\": 0.5446808510638298,\n \"acc_norm_stderr\": 0.03255525359340354\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3941798941798942,\n \"acc_stderr\": 0.02516798233389414,\n \"\
acc_norm\": 0.3941798941798942,\n \"acc_norm_stderr\": 0.02516798233389414\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7451612903225806,\n\
\ \"acc_stderr\": 0.024790118459332208,\n \"acc_norm\": 0.7451612903225806,\n\
\ \"acc_norm_stderr\": 0.024790118459332208\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.035176035403610084,\n\
\ \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.035176035403610084\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009182,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009182\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.797979797979798,\n \"acc_stderr\": 0.028606204289229872,\n \"\
acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.028606204289229872\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919443,\n\
\ \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919443\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6820512820512821,\n \"acc_stderr\": 0.02361088430892786,\n \
\ \"acc_norm\": 0.6820512820512821,\n \"acc_norm_stderr\": 0.02361088430892786\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.36666666666666664,\n \"acc_stderr\": 0.029381620726465073,\n \
\ \"acc_norm\": 0.36666666666666664,\n \"acc_norm_stderr\": 0.029381620726465073\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6512605042016807,\n \"acc_stderr\": 0.030956636328566548,\n\
\ \"acc_norm\": 0.6512605042016807,\n \"acc_norm_stderr\": 0.030956636328566548\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8348623853211009,\n \"acc_stderr\": 0.015919557829976044,\n \"\
acc_norm\": 0.8348623853211009,\n \"acc_norm_stderr\": 0.015919557829976044\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5416666666666666,\n \"acc_stderr\": 0.03398110890294636,\n \"\
acc_norm\": 0.5416666666666666,\n \"acc_norm_stderr\": 0.03398110890294636\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8137254901960784,\n \"acc_stderr\": 0.02732547096671631,\n \"\
acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.02732547096671631\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7890295358649789,\n \"acc_stderr\": 0.02655837250266192,\n \
\ \"acc_norm\": 0.7890295358649789,\n \"acc_norm_stderr\": 0.02655837250266192\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n\
\ \"acc_stderr\": 0.03160295143776679,\n \"acc_norm\": 0.6681614349775785,\n\
\ \"acc_norm_stderr\": 0.03160295143776679\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.021901905115073325,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.021901905115073325\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8122605363984674,\n\
\ \"acc_stderr\": 0.013964393769899133,\n \"acc_norm\": 0.8122605363984674,\n\
\ \"acc_norm_stderr\": 0.013964393769899133\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.708092485549133,\n \"acc_stderr\": 0.024476994076247337,\n\
\ \"acc_norm\": 0.708092485549133,\n \"acc_norm_stderr\": 0.024476994076247337\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.40782122905027934,\n\
\ \"acc_stderr\": 0.016435865260914742,\n \"acc_norm\": 0.40782122905027934,\n\
\ \"acc_norm_stderr\": 0.016435865260914742\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7124183006535948,\n \"acc_stderr\": 0.02591780611714716,\n\
\ \"acc_norm\": 0.7124183006535948,\n \"acc_norm_stderr\": 0.02591780611714716\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n\
\ \"acc_stderr\": 0.02608270069539966,\n \"acc_norm\": 0.6977491961414791,\n\
\ \"acc_norm_stderr\": 0.02608270069539966\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7098765432098766,\n \"acc_stderr\": 0.025251173936495026,\n\
\ \"acc_norm\": 0.7098765432098766,\n \"acc_norm_stderr\": 0.025251173936495026\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.46099290780141844,\n \"acc_stderr\": 0.029736592526424438,\n \
\ \"acc_norm\": 0.46099290780141844,\n \"acc_norm_stderr\": 0.029736592526424438\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4498044328552803,\n\
\ \"acc_stderr\": 0.01270572149856511,\n \"acc_norm\": 0.4498044328552803,\n\
\ \"acc_norm_stderr\": 0.01270572149856511\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.028245687391462923,\n\
\ \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.028245687391462923\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6633986928104575,\n \"acc_stderr\": 0.019117213911495144,\n \
\ \"acc_norm\": 0.6633986928104575,\n \"acc_norm_stderr\": 0.019117213911495144\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784603,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784603\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8557213930348259,\n\
\ \"acc_stderr\": 0.024845753212306053,\n \"acc_norm\": 0.8557213930348259,\n\
\ \"acc_norm_stderr\": 0.024845753212306053\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.39412484700122397,\n\
\ \"mc1_stderr\": 0.017106588140700322,\n \"mc2\": 0.5556543619352063,\n\
\ \"mc2_stderr\": 0.015507002997196854\n }\n}\n```"
repo_url: https://huggingface.co/Undi95/Mistral-11B-TestBench10
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|arc:challenge|25_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hellaswag|10_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-11T20-32-37.017457.parquet'
- config_name: results
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- results_2023-10-11T20-32-37.017457.parquet
- split: latest
path:
- results_2023-10-11T20-32-37.017457.parquet
---
# Dataset Card for Evaluation run of Undi95/Mistral-11B-TestBench10
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Undi95/Mistral-11B-TestBench10
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Undi95/Mistral-11B-TestBench10](https://huggingface.co/Undi95/Mistral-11B-TestBench10) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Undi95__Mistral-11B-TestBench10",
"harness_truthfulqa_mc_0",
split="train")
```
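A similar sketch loads the aggregated metrics from the "results" configuration (the "latest" split always points to the most recent run):
```python
from datasets import load_dataset

# Sketch: load the aggregated results of the most recent evaluation run.
results = load_dataset("open-llm-leaderboard/details_Undi95__Mistral-11B-TestBench10",
	"results",
	split="latest")
```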
## Latest results
These are the [latest results from run 2023-10-11T20:32:37.017457](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__Mistral-11B-TestBench10/blob/main/results_2023-10-11T20-32-37.017457.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6389303587566675,
"acc_stderr": 0.033080650268054235,
"acc_norm": 0.6424809348544399,
"acc_norm_stderr": 0.033059008030264514,
"mc1": 0.39412484700122397,
"mc1_stderr": 0.017106588140700322,
"mc2": 0.5556543619352063,
"mc2_stderr": 0.015507002997196854
},
"harness|arc:challenge|25": {
"acc": 0.6220136518771331,
"acc_stderr": 0.014169664520303098,
"acc_norm": 0.6424914675767918,
"acc_norm_stderr": 0.014005494275916576
},
"harness|hellaswag|10": {
"acc": 0.6533559051981677,
"acc_stderr": 0.004749286071559565,
"acc_norm": 0.8423620792670783,
"acc_norm_stderr": 0.003636564286352674
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595852,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595852
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6710526315789473,
"acc_stderr": 0.038234289699266046,
"acc_norm": 0.6710526315789473,
"acc_norm_stderr": 0.038234289699266046
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.02804918631569525,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.02804918631569525
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.037455547914624555,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.037455547914624555
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5446808510638298,
"acc_stderr": 0.03255525359340354,
"acc_norm": 0.5446808510638298,
"acc_norm_stderr": 0.03255525359340354
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3941798941798942,
"acc_stderr": 0.02516798233389414,
"acc_norm": 0.3941798941798942,
"acc_norm_stderr": 0.02516798233389414
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04426266681379909,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04426266681379909
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7451612903225806,
"acc_stderr": 0.024790118459332208,
"acc_norm": 0.7451612903225806,
"acc_norm_stderr": 0.024790118459332208
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.035176035403610084,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.035176035403610084
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009182,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009182
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.028606204289229872,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.028606204289229872
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919443,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6820512820512821,
"acc_stderr": 0.02361088430892786,
"acc_norm": 0.6820512820512821,
"acc_norm_stderr": 0.02361088430892786
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.36666666666666664,
"acc_stderr": 0.029381620726465073,
"acc_norm": 0.36666666666666664,
"acc_norm_stderr": 0.029381620726465073
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6512605042016807,
"acc_stderr": 0.030956636328566548,
"acc_norm": 0.6512605042016807,
"acc_norm_stderr": 0.030956636328566548
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8348623853211009,
"acc_stderr": 0.015919557829976044,
"acc_norm": 0.8348623853211009,
"acc_norm_stderr": 0.015919557829976044
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5416666666666666,
"acc_stderr": 0.03398110890294636,
"acc_norm": 0.5416666666666666,
"acc_norm_stderr": 0.03398110890294636
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8137254901960784,
"acc_stderr": 0.02732547096671631,
"acc_norm": 0.8137254901960784,
"acc_norm_stderr": 0.02732547096671631
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7890295358649789,
"acc_stderr": 0.02655837250266192,
"acc_norm": 0.7890295358649789,
"acc_norm_stderr": 0.02655837250266192
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6681614349775785,
"acc_stderr": 0.03160295143776679,
"acc_norm": 0.6681614349775785,
"acc_norm_stderr": 0.03160295143776679
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5,
"acc_stderr": 0.04745789978762494,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04745789978762494
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.021901905115073325,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.021901905115073325
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8122605363984674,
"acc_stderr": 0.013964393769899133,
"acc_norm": 0.8122605363984674,
"acc_norm_stderr": 0.013964393769899133
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.708092485549133,
"acc_stderr": 0.024476994076247337,
"acc_norm": 0.708092485549133,
"acc_norm_stderr": 0.024476994076247337
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.40782122905027934,
"acc_stderr": 0.016435865260914742,
"acc_norm": 0.40782122905027934,
"acc_norm_stderr": 0.016435865260914742
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7124183006535948,
"acc_stderr": 0.02591780611714716,
"acc_norm": 0.7124183006535948,
"acc_norm_stderr": 0.02591780611714716
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.02608270069539966,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.02608270069539966
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7098765432098766,
"acc_stderr": 0.025251173936495026,
"acc_norm": 0.7098765432098766,
"acc_norm_stderr": 0.025251173936495026
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46099290780141844,
"acc_stderr": 0.029736592526424438,
"acc_norm": 0.46099290780141844,
"acc_norm_stderr": 0.029736592526424438
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4498044328552803,
"acc_stderr": 0.01270572149856511,
"acc_norm": 0.4498044328552803,
"acc_norm_stderr": 0.01270572149856511
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.028245687391462923,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.028245687391462923
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6633986928104575,
"acc_stderr": 0.019117213911495144,
"acc_norm": 0.6633986928104575,
"acc_norm_stderr": 0.019117213911495144
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784603,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784603
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8557213930348259,
"acc_stderr": 0.024845753212306053,
"acc_norm": 0.8557213930348259,
"acc_norm_stderr": 0.024845753212306053
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.39412484700122397,
"mc1_stderr": 0.017106588140700322,
"mc2": 0.5556543619352063,
"mc2_stderr": 0.015507002997196854
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
RadAlienware/test1ultrachat | ---
license: mit
dataset_info:
features:
- name: Content
dtype: string
splits:
- name: train
num_bytes: 5722
num_examples: 1
- name: test
num_bytes: 5324
num_examples: 1
download_size: 5524
dataset_size: 11046
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
adzcai/genealogy_synthetic | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer0
dtype: string
- name: answer1
dtype: string
- name: answer2
dtype: string
- name: answer3
dtype: string
- name: label
dtype:
class_label:
names:
'0': '0'
'1': '1'
'2': '2'
'3': '3'
splits:
- name: train
num_bytes: 683054
num_examples: 2816
- name: test
num_bytes: 677690
num_examples: 2797
download_size: 415481
dataset_size: 1360744
---
# Dataset Card for "genealogy_synthetic"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
purnasai/SEC-10Q-10K-Statement-tables | ---
license: cc-by-nc-4.0
---
|
xhxhkxh/test | ---
license: cc0-1.0
---
|
luist18/portuguese-parliament-interventions | ---
license: mit
language:
- pt
tags:
- legal
- parliament
pretty_name: Portuguese Parliament Interventions
size_categories:
- n<1K
--- |
head_qa | ---
annotations_creators:
- no-annotation
language_creators:
- expert-generated
language:
- en
- es
license:
- mit
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- question-answering
task_ids:
- multiple-choice-qa
paperswithcode_id: headqa
pretty_name: HEAD-QA
dataset_info:
- config_name: es
features:
- name: name
dtype: string
- name: year
dtype: string
- name: category
dtype: string
- name: qid
dtype: int32
- name: qtext
dtype: string
- name: ra
dtype: int32
- name: image
dtype: image
- name: answers
list:
- name: aid
dtype: int32
- name: atext
dtype: string
splits:
- name: train
num_bytes: 1229678
num_examples: 2657
- name: test
num_bytes: 1204006
num_examples: 2742
- name: validation
num_bytes: 573354
num_examples: 1366
download_size: 79365502
dataset_size: 3007038
- config_name: en
features:
- name: name
dtype: string
- name: year
dtype: string
- name: category
dtype: string
- name: qid
dtype: int32
- name: qtext
dtype: string
- name: ra
dtype: int32
- name: image
dtype: image
- name: answers
list:
- name: aid
dtype: int32
- name: atext
dtype: string
splits:
- name: train
num_bytes: 1156808
num_examples: 2657
- name: test
num_bytes: 1131536
num_examples: 2742
- name: validation
num_bytes: 539892
num_examples: 1366
download_size: 79365502
dataset_size: 2828236
config_names:
- en
- es
---
# Dataset Card for HEAD-QA
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [HEAD-QA homepage](https://aghie.github.io/head-qa/)
- **Repository:** [HEAD-QA repository](https://github.com/aghie/head-qa)
- **Paper:** [HEAD-QA: A Healthcare Dataset for Complex Reasoning](https://www.aclweb.org/anthology/P19-1092/)
- **Leaderboard:** [HEAD-QA leaderboard](https://aghie.github.io/head-qa/#leaderboard-general)
- **Point of Contact:** [María Grandury](mailto:mariagrandury@gmail.com) (Dataset Submitter)
### Dataset Summary
HEAD-QA is a multi-choice HEAlthcare Dataset. The questions come from exams to access a specialized position in the
Spanish healthcare system, and are challenging even for highly specialized humans. They are designed by the
[Ministerio de Sanidad, Consumo y Bienestar Social](https://www.mscbs.gob.es/), who also provides direct
[access](https://fse.mscbs.gob.es/fseweb/view/public/datosanteriores/cuadernosExamen/busquedaConvocatoria.xhtml)
to the exams of the last 5 years (in Spanish).
```
Date of the last update of the documents object of the reuse: January, 14th, 2019.
```
HEAD-QA tries to make these questions accessible to the Natural Language Processing community. We hope it is a useful resource towards achieving better QA systems. The dataset contains questions about the following topics:
- Medicine
- Nursing
- Psychology
- Chemistry
- Pharmacology
- Biology
### Supported Tasks and Leaderboards
- `multiple-choice-qa`: HEAD-QA is a multi-choice question answering testbed to encourage research on complex reasoning.
### Languages
The questions and answers are available in both Spanish (BCP-47 code: 'es-ES') and English (BCP-47 code: 'en').
The default language is Spanish:
```
from datasets import load_dataset
data_es = load_dataset('head_qa')
data_en = load_dataset('head_qa', 'en')
```
## Dataset Structure
### Data Instances
A typical data point comprises a question `qtext`, multiple possible answers `atext` and the right answer `ra`.
An example from the HEAD-QA dataset looks as follows:
```
{
'qid': '1',
'category': 'biology',
'qtext': 'Los potenciales postsinápticos excitadores:',
'answers': [
{
'aid': 1,
'atext': 'Son de tipo todo o nada.'
},
{
'aid': 2,
'atext': 'Son hiperpolarizantes.'
},
{
'aid': 3,
'atext': 'Se pueden sumar.'
},
{
'aid': 4,
'atext': 'Se propagan a largas distancias.'
},
{
'aid': 5,
'atext': 'Presentan un periodo refractario.'
}],
'ra': '3',
'image': <PIL.PngImagePlugin.PngImageFile image mode=RGB size=675x538 at 0x1B42B6A1668>,
'name': 'Cuaderno_2013_1_B',
'year': '2013'
}
```
### Data Fields
- `qid`: question identifier (int)
- `category`: category of the question: "medicine", "nursing", "psychology", "chemistry", "pharmacology", "biology"
- `qtext`: question text
- `answers`: list of possible answers. Each element of the list is a dictionary with 2 keys:
- `aid`: answer identifier (int)
- `atext`: answer text
- `ra`: `aid` of the right answer (int)
- `image`: (optional) a `PIL.Image.Image` object containing the image. Note that when accessing the image column (`dataset[0]["image"]`), the image file is automatically decoded. Decoding a large number of image files might take a significant amount of time. Thus it is important to first query the sample index before the `"image"` column, *i.e.* `dataset[0]["image"]` should **always** be preferred over `dataset["image"][0]`; see the short sketch after this list.
- `name`: name of the exam from which the question was extracted
- `year`: year in which the exam took place
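A minimal access sketch, assuming the default (Spanish) configuration:
```
from datasets import load_dataset

# Index the example first so that only this sample's image is decoded;
# `data["image"][0]` would decode every image file in the split.
data = load_dataset("head_qa", split="train")
image = data[0]["image"]  # a PIL.Image.Image
```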
### Data Splits
The data is split into train, validation and test sets for each of the two languages. The split sizes are as follows:
| | Train | Val | Test |
| ----- | ------ | ----- | ---- |
| Spanish | 2657 | 1366 | 2742 |
| English | 2657 | 1366 | 2742 |
## Dataset Creation
### Curation Rationale
As motivation for the creation of this dataset, here is the abstract of the paper:
"We present HEAD-QA, a multi-choice question answering testbed to encourage research on complex reasoning. The questions
come from exams to access a specialized position in the Spanish healthcare system, and are challenging even for highly
specialized humans. We then consider monolingual (Spanish) and cross-lingual (to English) experiments with information
retrieval and neural techniques. We show that: (i) HEAD-QA challenges current methods, and (ii) the results lag well
behind human performance, demonstrating its usefulness as a benchmark for future work."
### Source Data
#### Initial Data Collection and Normalization
The questions come from exams to access a specialized position in the Spanish healthcare system, and are designed by the
[Ministerio de Sanidad, Consumo y Bienestar Social](https://www.mscbs.gob.es/), who also provides direct
[access](https://fse.mscbs.gob.es/fseweb/view/public/datosanteriores/cuadernosExamen/busquedaConvocatoria.xhtml)
to the exams of the last 5 years (in Spanish).
#### Who are the source language producers?
The dataset was created by David Vilares and Carlos Gómez-Rodríguez.
### Annotations
The dataset does not contain any additional annotations.
#### Annotation process
[N/A]
#### Who are the annotators?
[N/A]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
The dataset was created by David Vilares and Carlos Gómez-Rodríguez.
### Licensing Information
According to the [HEAD-QA homepage](https://aghie.github.io/head-qa/#legal-requirements):
The Ministerio de Sanidad, Consumo y Bienestar Social allows the redistribution of the exams and their content under [certain conditions](https://www.mscbs.gob.es/avisoLegal/home.htm):
- The denaturalization of the content of the information is prohibited in any circumstance.
- The user is obliged to cite the source of the documents subject to reuse.
- The user is obliged to indicate the date of the last update of the documents object of the reuse.
According to the [HEAD-QA repository](https://github.com/aghie/head-qa/blob/master/LICENSE):
The dataset is licensed under the [MIT License](https://mit-license.org/).
### Citation Information
```
@inproceedings{vilares-gomez-rodriguez-2019-head,
title = "{HEAD}-{QA}: A Healthcare Dataset for Complex Reasoning",
author = "Vilares, David and
G{\'o}mez-Rodr{\'i}guez, Carlos",
booktitle = "Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics",
month = jul,
year = "2019",
address = "Florence, Italy",
publisher = "Association for Computational Linguistics",
url = "https://www.aclweb.org/anthology/P19-1092",
doi = "10.18653/v1/P19-1092",
pages = "960--966",
abstract = "We present HEAD-QA, a multi-choice question answering testbed to encourage research on complex reasoning. The questions come from exams to access a specialized position in the Spanish healthcare system, and are challenging even for highly specialized humans. We then consider monolingual (Spanish) and cross-lingual (to English) experiments with information retrieval and neural techniques. We show that: (i) HEAD-QA challenges current methods, and (ii) the results lag well behind human performance, demonstrating its usefulness as a benchmark for future work.",
}
```
### Contributions
Thanks to [@mariagrandury](https://github.com/mariagrandury) for adding this dataset. |
davanstrien/autotrain-data-dataset-mentions-160223 | Invalid username or password. |
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/f58dd95d | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 180
num_examples: 10
download_size: 1314
dataset_size: 180
---
# Dataset Card for "f58dd95d"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ch08931/Antony | ---
license: openrail
---
|
Kasper7953/github-issues_small | ---
dataset_info:
features:
- name: input_ids
sequence: int64
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 2837664.0
num_examples: 708
- name: val
num_bytes: 505008.0
num_examples: 126
download_size: 1006544
dataset_size: 3342672.0
---
# Dataset Card for "github-issues_small"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
introspector/papers | ---
license: creativeml-openrail-m
---
This contains papers in different forms. The shell history below records how they were fetched and converted:
```shell
git submodule add https://github.com/ppwwyyxx/SoPaper
cd SoPaper/
ls
pip install .
sopaper
sopaper unimath
ls
mkdir data
mv Unimath.pdf data/
sopaper unimath --help
pdftotext data/Unimath.pdf
cd data/
git init
git add Unimath.*
git commit -m 'baseline'
pandoc Unimath.pdf Unimath.org
pdftohtml Unimath.pdf
ls -ltar
pandoc Unimath.html Unimath.org
pandoc Unimath.html -o Unimath.org
pandoc Unimath.html -O Unimath.org
pandoc --help
pandoc Unimath.html --to org
ls -latr
pandoc Unimaths.html --to org
pandoc Unimaths.html --to org >Unimath.org
pandoc Unimaths.html --to md
pandoc Unimaths.html --to markdown
git add Unimath.org
git commit -m 'base' -a
git remote add https://huggingface.co/datasets/introspector/papers
git remote add origin https://huggingface.co/datasets/introspector/papers
git add *
git commit -m 'paper step1' -a
git push
git pull
git config pull.rebase true # rebase
git pull
git commit -m 'merge' -a
git push
cp ~/.gitignore_templates/Emacs.gitignore .gitingnore
cp ~/.gitignore_templates/Emacs.gitignore .gitignore
git status
git add .gitignore
git commit -m 'clean' -a
ls
mkdir -p 2016/09/27/Heidelberg/HLF2015/Unimath
mv Unimath* 2016/09/27/Heidelberg/HLF2015/Unimath/
git status
git add 2016
git commit -m 'moving' -a
git push
mv 2016/09/27/Heidelberg/HLF2015 016/09/27/Heidelberg/HLF2016
git add 2016/
git commit -m 'move' -a
git push
history
```
|
irds/gov2_trec-tb-2006_efficiency_stream1 | ---
pretty_name: '`gov2/trec-tb-2006/efficiency/stream1`'
viewer: false
source_datasets: ['irds/gov2']
task_categories:
- text-retrieval
---
# Dataset Card for `gov2/trec-tb-2006/efficiency/stream1`
The `gov2/trec-tb-2006/efficiency/stream1` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/gov2#gov2/trec-tb-2006/efficiency/stream1).
# Data
This dataset provides:
- `queries` (i.e., topics); count=25,000
- For `docs`, use [`irds/gov2`](https://huggingface.co/datasets/irds/gov2)
## Usage
```python
from datasets import load_dataset
queries = load_dataset('irds/gov2_trec-tb-2006_efficiency_stream1', 'queries')
for record in queries:
record # {'query_id': ..., 'text': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
## Citation Information
```
@inproceedings{Buttcher2006TrecTerabyte,
title={The TREC 2006 Terabyte Track},
author={Stefan B\"uttcher and Charles L. A. Clarke and Ian Soboroff},
booktitle={TREC},
year={2006}
}
```
|
dkoterwa/mlqa_filtered | ---
dataset_info:
features:
- name: context
dtype: string
- name: question
dtype: string
- name: id
dtype: string
- name: lang
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 46553350
num_examples: 41019
download_size: 24529110
dataset_size: 46553350
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# mlqa filtered version
For a better dataset description, please visit the official site of the source dataset: [LINK](https://huggingface.co/datasets/mlqa) <br>
<br>
**This dataset was prepared by converting the MLQA dataset.** I concatenated the versions of the dataset for the languages of interest and retrieved the text answers from the "answers" column.
**I additionally share the code I used to convert the original dataset, to make everything clearer.**
```python
import pandas as pd
from datasets import load_dataset
from tqdm import tqdm

def download_mlqa(subset_name):
dataset_valid = load_dataset("mlqa", subset_name, split="validation").to_pandas()
dataset_test = load_dataset("mlqa", subset_name, split="test").to_pandas()
full_dataset = pd.concat([dataset_valid, dataset_test])
full_dataset.reset_index(drop=True, inplace=True)
return full_dataset
needed_langs = ["mlqa.en.en", "mlqa.de.de", "mlqa.ar.ar", "mlqa.es.es", "mlqa.vi.vi", "mlqa.zh.zh"]
datasets = []
for lang in tqdm(needed_langs):
dataset = download_mlqa(lang)
dataset["lang"] = lang.split(".")[2]
datasets.append(dataset)
full_mlqa = pd.concat(datasets)
full_mlqa.reset_index(drop=True, inplace=True)
full_mlqa["answer"] = [answer_dict["text"][0] for answer_dict in full_mlqa["answers"]]
full_mlqa.drop("answers", axis=1, inplace=True)
```
**How to download**
```python
from datasets import load_dataset
data = load_dataset("dkoterwa/mlqa_filtered")
``` |
GriddleDean/mangaupdates | ---
language:
- en
tags:
- manga
- tags
- genres
- scraped
size_categories:
- 100K<n<1M
---
I scraped [mangaupdates](https://www.mangaupdates.com) for a project and I am sharing the data. There is a tar file which contains the JSON response from every info entry.
I parsed it and added it to a Postgres database. The pgdump was uploaded too. There are some entries that no longer exist; their ids can be found in the removed ids JSON.
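After restoring the pgdump into a local Postgres instance, a minimal query sketch against the tables described below might look like this (the database name, credentials and the `psycopg2` client are assumptions; adjust them to your setup):

```python
import psycopg2  # assumed Postgres client; any driver works

# Placeholder connection details for a locally restored copy of the pgdump.
conn = psycopg2.connect(dbname="mangaupdates", user="postgres", host="localhost")
cur = conn.cursor()

# Resolve the integer genre ids stored on each info row back to genre names.
cur.execute("""
    SELECT i.titles[1] AS title, array_agg(g.name) AS genres
    FROM info i
    JOIN genres g ON g.id = ANY(i.genres)
    GROUP BY i.id, i.titles
    LIMIT 5
""")
for title, genres in cur.fetchall():
    print(title, genres)

cur.close()
conn.close()
```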
<details>
<summary>SQL structure</summary>
I didn't try to make it an optimal structure, but I tried to remove the redundancy of strings.
### Info
```sql
create table info
(
id serial primary key,
private_id int,
public_id bigint not null,
forum_id bigint not null,
url_key text not null,
url_name text,
titles text[] not null,
description text,
image_name text,
typ int not null,
year int,
latest_chapter integer not null,
rating integer not null,
bayesian_rating float,
genres int[] not null,
tags int[] not null,
tags_upvotes int[] not null,
tags_downvotes int[] not null,
tags_uploader bigint[] not null,
status text,
licensed boolean not null,
completed boolean not null,
author int[] not null,
artist int[] not null,
publisher_original int[] not null,
publisher_english int[] not null,
publication text[] not null,
publication_publisher int[] not null,
relations text[] not null,
anime_start text,
anime_end text,
last_updated_mu TIMESTAMP,
last_updated TIMESTAMP not null,
created TIMESTAMP not null
);
```
### Types
```sql
create table if not exists mtypes
(
id serial primary key,
name text not null
);
```
### Genres
```sql
create table if not exists genres
(
id serial primary key,
name text not null
);
```
### Tags
```sql
create table if not exists tags
(
id serial primary key,
name text not null
);
```
### People
```sql
create table if not exists ppl
(
id serial primary key,
mu_id bigint,
name text not null
);
```
</details> |
RadRadboy/Yone-br | ---
license: openrail
---
|
open-llm-leaderboard/details_nvidia__OpenMath-Mistral-7B-v0.1-hf | ---
pretty_name: Evaluation run of nvidia/OpenMath-Mistral-7B-v0.1-hf
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [nvidia/OpenMath-Mistral-7B-v0.1-hf](https://huggingface.co/nvidia/OpenMath-Mistral-7B-v0.1-hf)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_nvidia__OpenMath-Mistral-7B-v0.1-hf\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-19T05:35:41.230176](https://huggingface.co/datasets/open-llm-leaderboard/details_nvidia__OpenMath-Mistral-7B-v0.1-hf/blob/main/results_2024-02-19T05-35-41.230176.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5866401075348832,\n\
\ \"acc_stderr\": 0.032864145623173094,\n \"acc_norm\": 0.5972287449546527,\n\
\ \"acc_norm_stderr\": 0.0337465309246502,\n \"mc1\": 0.29865361077111385,\n\
\ \"mc1_stderr\": 0.01602157061376854,\n \"mc2\": 0.46132143106665885,\n\
\ \"mc2_stderr\": 0.014976964923448676\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5571672354948806,\n \"acc_stderr\": 0.014515573873348907,\n\
\ \"acc_norm\": 0.5938566552901023,\n \"acc_norm_stderr\": 0.014351656690097862\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.629555865365465,\n\
\ \"acc_stderr\": 0.004819367172685957,\n \"acc_norm\": 0.8177653853813981,\n\
\ \"acc_norm_stderr\": 0.0038524881775539627\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5407407407407407,\n\
\ \"acc_stderr\": 0.04304979692464243,\n \"acc_norm\": 0.5407407407407407,\n\
\ \"acc_norm_stderr\": 0.04304979692464243\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6052631578947368,\n \"acc_stderr\": 0.039777499346220734,\n\
\ \"acc_norm\": 0.6052631578947368,\n \"acc_norm_stderr\": 0.039777499346220734\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n\
\ \"acc_stderr\": 0.04999999999999999,\n \"acc_norm\": 0.55,\n \
\ \"acc_norm_stderr\": 0.04999999999999999\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6754716981132075,\n \"acc_stderr\": 0.02881561571343211,\n\
\ \"acc_norm\": 0.6754716981132075,\n \"acc_norm_stderr\": 0.02881561571343211\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6736111111111112,\n\
\ \"acc_stderr\": 0.03921067198982266,\n \"acc_norm\": 0.6736111111111112,\n\
\ \"acc_norm_stderr\": 0.03921067198982266\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n\
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5664739884393064,\n\
\ \"acc_stderr\": 0.037786210790920566,\n \"acc_norm\": 0.5664739884393064,\n\
\ \"acc_norm_stderr\": 0.037786210790920566\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.047551296160629475,\n\
\ \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.047551296160629475\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.8,\n \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\": 0.8,\n\
\ \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5234042553191489,\n \"acc_stderr\": 0.032650194750335815,\n\
\ \"acc_norm\": 0.5234042553191489,\n \"acc_norm_stderr\": 0.032650194750335815\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.39473684210526316,\n\
\ \"acc_stderr\": 0.045981880578165414,\n \"acc_norm\": 0.39473684210526316,\n\
\ \"acc_norm_stderr\": 0.045981880578165414\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.04164188720169375,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.04164188720169375\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4021164021164021,\n \"acc_stderr\": 0.02525303255499769,\n \"\
acc_norm\": 0.4021164021164021,\n \"acc_norm_stderr\": 0.02525303255499769\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.04216370213557836,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.04216370213557836\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6709677419354839,\n\
\ \"acc_stderr\": 0.02672949906834996,\n \"acc_norm\": 0.6709677419354839,\n\
\ \"acc_norm_stderr\": 0.02672949906834996\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4482758620689655,\n \"acc_stderr\": 0.03499113137676744,\n\
\ \"acc_norm\": 0.4482758620689655,\n \"acc_norm_stderr\": 0.03499113137676744\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\"\
: 0.62,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7393939393939394,\n \"acc_stderr\": 0.034277431758165236,\n\
\ \"acc_norm\": 0.7393939393939394,\n \"acc_norm_stderr\": 0.034277431758165236\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7727272727272727,\n \"acc_stderr\": 0.02985751567338642,\n \"\
acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.02985751567338642\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8082901554404145,\n \"acc_stderr\": 0.028408953626245282,\n\
\ \"acc_norm\": 0.8082901554404145,\n \"acc_norm_stderr\": 0.028408953626245282\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5692307692307692,\n \"acc_stderr\": 0.025106820660539757,\n\
\ \"acc_norm\": 0.5692307692307692,\n \"acc_norm_stderr\": 0.025106820660539757\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.0287420409039485,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.0287420409039485\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5966386554621849,\n \"acc_stderr\": 0.031866081214088314,\n\
\ \"acc_norm\": 0.5966386554621849,\n \"acc_norm_stderr\": 0.031866081214088314\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7889908256880734,\n \"acc_stderr\": 0.01749392240411265,\n \"\
acc_norm\": 0.7889908256880734,\n \"acc_norm_stderr\": 0.01749392240411265\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4212962962962963,\n \"acc_stderr\": 0.03367462138896079,\n \"\
acc_norm\": 0.4212962962962963,\n \"acc_norm_stderr\": 0.03367462138896079\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7450980392156863,\n \"acc_stderr\": 0.030587591351604246,\n \"\
acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.030587591351604246\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7510548523206751,\n \"acc_stderr\": 0.028146970599422644,\n \
\ \"acc_norm\": 0.7510548523206751,\n \"acc_norm_stderr\": 0.028146970599422644\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6591928251121076,\n\
\ \"acc_stderr\": 0.03181149747055359,\n \"acc_norm\": 0.6591928251121076,\n\
\ \"acc_norm_stderr\": 0.03181149747055359\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.732824427480916,\n \"acc_stderr\": 0.03880848301082394,\n\
\ \"acc_norm\": 0.732824427480916,\n \"acc_norm_stderr\": 0.03880848301082394\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.743801652892562,\n \"acc_stderr\": 0.03984979653302872,\n \"acc_norm\"\
: 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302872\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7129629629629629,\n\
\ \"acc_stderr\": 0.043733130409147614,\n \"acc_norm\": 0.7129629629629629,\n\
\ \"acc_norm_stderr\": 0.043733130409147614\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7116564417177914,\n \"acc_stderr\": 0.035590395316173425,\n\
\ \"acc_norm\": 0.7116564417177914,\n \"acc_norm_stderr\": 0.035590395316173425\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n\
\ \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8504273504273504,\n\
\ \"acc_stderr\": 0.023365051491753715,\n \"acc_norm\": 0.8504273504273504,\n\
\ \"acc_norm_stderr\": 0.023365051491753715\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.04461960433384741,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.04461960433384741\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7854406130268199,\n\
\ \"acc_stderr\": 0.014680033956893346,\n \"acc_norm\": 0.7854406130268199,\n\
\ \"acc_norm_stderr\": 0.014680033956893346\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6820809248554913,\n \"acc_stderr\": 0.025070713719153176,\n\
\ \"acc_norm\": 0.6820809248554913,\n \"acc_norm_stderr\": 0.025070713719153176\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.288268156424581,\n\
\ \"acc_stderr\": 0.015149132860209432,\n \"acc_norm\": 0.288268156424581,\n\
\ \"acc_norm_stderr\": 0.015149132860209432\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6568627450980392,\n \"acc_stderr\": 0.027184498909941613,\n\
\ \"acc_norm\": 0.6568627450980392,\n \"acc_norm_stderr\": 0.027184498909941613\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.684887459807074,\n\
\ \"acc_stderr\": 0.026385273703464482,\n \"acc_norm\": 0.684887459807074,\n\
\ \"acc_norm_stderr\": 0.026385273703464482\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6820987654320988,\n \"acc_stderr\": 0.025910063528240868,\n\
\ \"acc_norm\": 0.6820987654320988,\n \"acc_norm_stderr\": 0.025910063528240868\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.43617021276595747,\n \"acc_stderr\": 0.02958345203628407,\n \
\ \"acc_norm\": 0.43617021276595747,\n \"acc_norm_stderr\": 0.02958345203628407\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.41916558018252936,\n\
\ \"acc_stderr\": 0.012602244505788236,\n \"acc_norm\": 0.41916558018252936,\n\
\ \"acc_norm_stderr\": 0.012602244505788236\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5919117647058824,\n \"acc_stderr\": 0.029855261393483924,\n\
\ \"acc_norm\": 0.5919117647058824,\n \"acc_norm_stderr\": 0.029855261393483924\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6209150326797386,\n \"acc_stderr\": 0.01962744474841223,\n \
\ \"acc_norm\": 0.6209150326797386,\n \"acc_norm_stderr\": 0.01962744474841223\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6448979591836734,\n \"acc_stderr\": 0.030635655150387638,\n\
\ \"acc_norm\": 0.6448979591836734,\n \"acc_norm_stderr\": 0.030635655150387638\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8208955223880597,\n\
\ \"acc_stderr\": 0.027113286753111837,\n \"acc_norm\": 0.8208955223880597,\n\
\ \"acc_norm_stderr\": 0.027113286753111837\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896308,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896308\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n\
\ \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n\
\ \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.0312678171466318,\n\
\ \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.0312678171466318\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.29865361077111385,\n\
\ \"mc1_stderr\": 0.01602157061376854,\n \"mc2\": 0.46132143106665885,\n\
\ \"mc2_stderr\": 0.014976964923448676\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7726913970007893,\n \"acc_stderr\": 0.011778612167091087\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.000758150113722517,\n \
\ \"acc_stderr\": 0.0007581501137225382\n }\n}\n```"
repo_url: https://huggingface.co/nvidia/OpenMath-Mistral-7B-v0.1-hf
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_19T05_35_41.230176
path:
- '**/details_harness|arc:challenge|25_2024-02-19T05-35-41.230176.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-19T05-35-41.230176.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_19T05_35_41.230176
path:
- '**/details_harness|gsm8k|5_2024-02-19T05-35-41.230176.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-19T05-35-41.230176.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_19T05_35_41.230176
path:
- '**/details_harness|hellaswag|10_2024-02-19T05-35-41.230176.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-19T05-35-41.230176.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_19T05_35_41.230176
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-19T05-35-41.230176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-19T05-35-41.230176.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-19T05-35-41.230176.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_19T05_35_41.230176
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-19T05-35-41.230176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-19T05-35-41.230176.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_19T05_35_41.230176
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-19T05-35-41.230176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-19T05-35-41.230176.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_19T05_35_41.230176
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-19T05-35-41.230176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-19T05-35-41.230176.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_19T05_35_41.230176
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-19T05-35-41.230176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-19T05-35-41.230176.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_19T05_35_41.230176
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-19T05-35-41.230176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-19T05-35-41.230176.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_19T05_35_41.230176
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-19T05-35-41.230176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-19T05-35-41.230176.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_19T05_35_41.230176
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-19T05-35-41.230176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-19T05-35-41.230176.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_19T05_35_41.230176
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-19T05-35-41.230176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-19T05-35-41.230176.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_19T05_35_41.230176
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-19T05-35-41.230176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-19T05-35-41.230176.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_19T05_35_41.230176
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-19T05-35-41.230176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-19T05-35-41.230176.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_19T05_35_41.230176
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-19T05-35-41.230176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-19T05-35-41.230176.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_19T05_35_41.230176
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-19T05-35-41.230176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-19T05-35-41.230176.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_19T05_35_41.230176
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-19T05-35-41.230176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-19T05-35-41.230176.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_19T05_35_41.230176
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-19T05-35-41.230176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-19T05-35-41.230176.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_19T05_35_41.230176
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-19T05-35-41.230176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-19T05-35-41.230176.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_19T05_35_41.230176
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-19T05-35-41.230176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-19T05-35-41.230176.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_19T05_35_41.230176
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-19T05-35-41.230176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-19T05-35-41.230176.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_19T05_35_41.230176
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-19T05-35-41.230176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-19T05-35-41.230176.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_19T05_35_41.230176
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-19T05-35-41.230176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-19T05-35-41.230176.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_19T05_35_41.230176
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-19T05-35-41.230176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-19T05-35-41.230176.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_19T05_35_41.230176
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-19T05-35-41.230176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-19T05-35-41.230176.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_19T05_35_41.230176
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-19T05-35-41.230176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-19T05-35-41.230176.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_19T05_35_41.230176
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-19T05-35-41.230176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-19T05-35-41.230176.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_19T05_35_41.230176
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-19T05-35-41.230176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-19T05-35-41.230176.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_19T05_35_41.230176
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-19T05-35-41.230176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-19T05-35-41.230176.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_19T05_35_41.230176
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-19T05-35-41.230176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-19T05-35-41.230176.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_19T05_35_41.230176
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-19T05-35-41.230176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-19T05-35-41.230176.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_19T05_35_41.230176
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-19T05-35-41.230176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-19T05-35-41.230176.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_19T05_35_41.230176
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-19T05-35-41.230176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-19T05-35-41.230176.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_19T05_35_41.230176
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-19T05-35-41.230176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-19T05-35-41.230176.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_19T05_35_41.230176
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-19T05-35-41.230176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-19T05-35-41.230176.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_19T05_35_41.230176
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-19T05-35-41.230176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-19T05-35-41.230176.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_19T05_35_41.230176
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-19T05-35-41.230176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-19T05-35-41.230176.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_19T05_35_41.230176
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-19T05-35-41.230176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-19T05-35-41.230176.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_19T05_35_41.230176
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-19T05-35-41.230176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-19T05-35-41.230176.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_19T05_35_41.230176
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-19T05-35-41.230176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-19T05-35-41.230176.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_19T05_35_41.230176
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-19T05-35-41.230176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-19T05-35-41.230176.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_19T05_35_41.230176
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-19T05-35-41.230176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-19T05-35-41.230176.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_19T05_35_41.230176
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-19T05-35-41.230176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-19T05-35-41.230176.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_19T05_35_41.230176
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-19T05-35-41.230176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-19T05-35-41.230176.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_19T05_35_41.230176
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-19T05-35-41.230176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-19T05-35-41.230176.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_19T05_35_41.230176
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-19T05-35-41.230176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-19T05-35-41.230176.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_19T05_35_41.230176
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-19T05-35-41.230176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-19T05-35-41.230176.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_19T05_35_41.230176
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-19T05-35-41.230176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-19T05-35-41.230176.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_19T05_35_41.230176
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-19T05-35-41.230176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-19T05-35-41.230176.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_19T05_35_41.230176
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-19T05-35-41.230176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-19T05-35-41.230176.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_19T05_35_41.230176
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-19T05-35-41.230176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-19T05-35-41.230176.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_19T05_35_41.230176
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-19T05-35-41.230176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-19T05-35-41.230176.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_19T05_35_41.230176
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-19T05-35-41.230176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-19T05-35-41.230176.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_19T05_35_41.230176
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-19T05-35-41.230176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-19T05-35-41.230176.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_19T05_35_41.230176
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-19T05-35-41.230176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-19T05-35-41.230176.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_19T05_35_41.230176
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-19T05-35-41.230176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-19T05-35-41.230176.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_19T05_35_41.230176
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-19T05-35-41.230176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-19T05-35-41.230176.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_19T05_35_41.230176
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-19T05-35-41.230176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-19T05-35-41.230176.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_19T05_35_41.230176
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-19T05-35-41.230176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-19T05-35-41.230176.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_19T05_35_41.230176
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-19T05-35-41.230176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-19T05-35-41.230176.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_19T05_35_41.230176
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-19T05-35-41.230176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-19T05-35-41.230176.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_19T05_35_41.230176
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-19T05-35-41.230176.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-19T05-35-41.230176.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_19T05_35_41.230176
path:
- '**/details_harness|winogrande|5_2024-02-19T05-35-41.230176.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-19T05-35-41.230176.parquet'
- config_name: results
data_files:
- split: 2024_02_19T05_35_41.230176
path:
- results_2024-02-19T05-35-41.230176.parquet
- split: latest
path:
- results_2024-02-19T05-35-41.230176.parquet
---
# Dataset Card for Evaluation run of nvidia/OpenMath-Mistral-7B-v0.1-hf
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [nvidia/OpenMath-Mistral-7B-v0.1-hf](https://huggingface.co/nvidia/OpenMath-Mistral-7B-v0.1-hf) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_nvidia__OpenMath-Mistral-7B-v0.1-hf",
"harness_winogrande_5",
split="train")
```
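The aggregated results described above can be loaded the same way; here is a minimal sketch using the `results` config and its `latest` split as listed in this card's metadata (an illustration, not part of the original card):
```python
from datasets import load_dataset

# aggregated metrics for the run ("results" config, "latest" split, per the configs section above)
results = load_dataset("open-llm-leaderboard/details_nvidia__OpenMath-Mistral-7B-v0.1-hf",
	"results",
	split="latest")
```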
## Latest results
These are the [latest results from run 2024-02-19T05:35:41.230176](https://huggingface.co/datasets/open-llm-leaderboard/details_nvidia__OpenMath-Mistral-7B-v0.1-hf/blob/main/results_2024-02-19T05-35-41.230176.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5866401075348832,
"acc_stderr": 0.032864145623173094,
"acc_norm": 0.5972287449546527,
"acc_norm_stderr": 0.0337465309246502,
"mc1": 0.29865361077111385,
"mc1_stderr": 0.01602157061376854,
"mc2": 0.46132143106665885,
"mc2_stderr": 0.014976964923448676
},
"harness|arc:challenge|25": {
"acc": 0.5571672354948806,
"acc_stderr": 0.014515573873348907,
"acc_norm": 0.5938566552901023,
"acc_norm_stderr": 0.014351656690097862
},
"harness|hellaswag|10": {
"acc": 0.629555865365465,
"acc_stderr": 0.004819367172685957,
"acc_norm": 0.8177653853813981,
"acc_norm_stderr": 0.0038524881775539627
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5407407407407407,
"acc_stderr": 0.04304979692464243,
"acc_norm": 0.5407407407407407,
"acc_norm_stderr": 0.04304979692464243
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6052631578947368,
"acc_stderr": 0.039777499346220734,
"acc_norm": 0.6052631578947368,
"acc_norm_stderr": 0.039777499346220734
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.04999999999999999,
"acc_norm": 0.55,
"acc_norm_stderr": 0.04999999999999999
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6754716981132075,
"acc_stderr": 0.02881561571343211,
"acc_norm": 0.6754716981132075,
"acc_norm_stderr": 0.02881561571343211
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6736111111111112,
"acc_stderr": 0.03921067198982266,
"acc_norm": 0.6736111111111112,
"acc_norm_stderr": 0.03921067198982266
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5664739884393064,
"acc_stderr": 0.037786210790920566,
"acc_norm": 0.5664739884393064,
"acc_norm_stderr": 0.037786210790920566
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.047551296160629475,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.047551296160629475
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5234042553191489,
"acc_stderr": 0.032650194750335815,
"acc_norm": 0.5234042553191489,
"acc_norm_stderr": 0.032650194750335815
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.39473684210526316,
"acc_stderr": 0.045981880578165414,
"acc_norm": 0.39473684210526316,
"acc_norm_stderr": 0.045981880578165414
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.04164188720169375,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.04164188720169375
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4021164021164021,
"acc_stderr": 0.02525303255499769,
"acc_norm": 0.4021164021164021,
"acc_norm_stderr": 0.02525303255499769
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04216370213557836,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04216370213557836
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6709677419354839,
"acc_stderr": 0.02672949906834996,
"acc_norm": 0.6709677419354839,
"acc_norm_stderr": 0.02672949906834996
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4482758620689655,
"acc_stderr": 0.03499113137676744,
"acc_norm": 0.4482758620689655,
"acc_norm_stderr": 0.03499113137676744
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7393939393939394,
"acc_stderr": 0.034277431758165236,
"acc_norm": 0.7393939393939394,
"acc_norm_stderr": 0.034277431758165236
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.02985751567338642,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.02985751567338642
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8082901554404145,
"acc_stderr": 0.028408953626245282,
"acc_norm": 0.8082901554404145,
"acc_norm_stderr": 0.028408953626245282
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5692307692307692,
"acc_stderr": 0.025106820660539757,
"acc_norm": 0.5692307692307692,
"acc_norm_stderr": 0.025106820660539757
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.0287420409039485,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.0287420409039485
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5966386554621849,
"acc_stderr": 0.031866081214088314,
"acc_norm": 0.5966386554621849,
"acc_norm_stderr": 0.031866081214088314
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7889908256880734,
"acc_stderr": 0.01749392240411265,
"acc_norm": 0.7889908256880734,
"acc_norm_stderr": 0.01749392240411265
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4212962962962963,
"acc_stderr": 0.03367462138896079,
"acc_norm": 0.4212962962962963,
"acc_norm_stderr": 0.03367462138896079
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7450980392156863,
"acc_stderr": 0.030587591351604246,
"acc_norm": 0.7450980392156863,
"acc_norm_stderr": 0.030587591351604246
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7510548523206751,
"acc_stderr": 0.028146970599422644,
"acc_norm": 0.7510548523206751,
"acc_norm_stderr": 0.028146970599422644
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6591928251121076,
"acc_stderr": 0.03181149747055359,
"acc_norm": 0.6591928251121076,
"acc_norm_stderr": 0.03181149747055359
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.732824427480916,
"acc_stderr": 0.03880848301082394,
"acc_norm": 0.732824427480916,
"acc_norm_stderr": 0.03880848301082394
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.03984979653302872,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.03984979653302872
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.043733130409147614,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.043733130409147614
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7116564417177914,
"acc_stderr": 0.035590395316173425,
"acc_norm": 0.7116564417177914,
"acc_norm_stderr": 0.035590395316173425
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7281553398058253,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.7281553398058253,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8504273504273504,
"acc_stderr": 0.023365051491753715,
"acc_norm": 0.8504273504273504,
"acc_norm_stderr": 0.023365051491753715
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.73,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7854406130268199,
"acc_stderr": 0.014680033956893346,
"acc_norm": 0.7854406130268199,
"acc_norm_stderr": 0.014680033956893346
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6820809248554913,
"acc_stderr": 0.025070713719153176,
"acc_norm": 0.6820809248554913,
"acc_norm_stderr": 0.025070713719153176
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.288268156424581,
"acc_stderr": 0.015149132860209432,
"acc_norm": 0.288268156424581,
"acc_norm_stderr": 0.015149132860209432
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6568627450980392,
"acc_stderr": 0.027184498909941613,
"acc_norm": 0.6568627450980392,
"acc_norm_stderr": 0.027184498909941613
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.684887459807074,
"acc_stderr": 0.026385273703464482,
"acc_norm": 0.684887459807074,
"acc_norm_stderr": 0.026385273703464482
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6820987654320988,
"acc_stderr": 0.025910063528240868,
"acc_norm": 0.6820987654320988,
"acc_norm_stderr": 0.025910063528240868
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.43617021276595747,
"acc_stderr": 0.02958345203628407,
"acc_norm": 0.43617021276595747,
"acc_norm_stderr": 0.02958345203628407
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.41916558018252936,
"acc_stderr": 0.012602244505788236,
"acc_norm": 0.41916558018252936,
"acc_norm_stderr": 0.012602244505788236
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5919117647058824,
"acc_stderr": 0.029855261393483924,
"acc_norm": 0.5919117647058824,
"acc_norm_stderr": 0.029855261393483924
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6209150326797386,
"acc_stderr": 0.01962744474841223,
"acc_norm": 0.6209150326797386,
"acc_norm_stderr": 0.01962744474841223
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6448979591836734,
"acc_stderr": 0.030635655150387638,
"acc_norm": 0.6448979591836734,
"acc_norm_stderr": 0.030635655150387638
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8208955223880597,
"acc_stderr": 0.027113286753111837,
"acc_norm": 0.8208955223880597,
"acc_norm_stderr": 0.027113286753111837
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.03379976689896308,
"acc_norm": 0.87,
"acc_norm_stderr": 0.03379976689896308
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7894736842105263,
"acc_stderr": 0.0312678171466318,
"acc_norm": 0.7894736842105263,
"acc_norm_stderr": 0.0312678171466318
},
"harness|truthfulqa:mc|0": {
"mc1": 0.29865361077111385,
"mc1_stderr": 0.01602157061376854,
"mc2": 0.46132143106665885,
"mc2_stderr": 0.014976964923448676
},
"harness|winogrande|5": {
"acc": 0.7726913970007893,
"acc_stderr": 0.011778612167091087
},
"harness|gsm8k|5": {
"acc": 0.000758150113722517,
"acc_stderr": 0.0007581501137225382
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
alvations/food-and-beverage | ---
license: cc-by-sa-4.0
---
This is a list of food and beverage terms, human-translated from English into multiple languages:
- English (source)
- German
- Japanese
- Arabic
- Korean
- French
- Spanish
- Vietnamese
- Aymara
- Nahuatl
- Thai
- Indonesian
- Malay
- Burmese
# Cite
> Liling Tan (2024) LexMT: An Analysis of Machine Translation for Learners Lexicon. https://huggingface.co/datasets/alvations/
```
@article{tan-2023-lexmt-bodyparts,
title = "LexMT: An Analysis of Machine Translation for Learners Lexicon",
author = "Tan, Liling",
journal = "alvations.com",
year = "2024",
month = "Feb",
url = "https://huggingface.co/datasets/alvations/"
}
``` |
AiBototicus/Wallet-Addresses | ---
license: unknown
---
|
Gurveer05/plant-promoter-sequences | ---
tags:
- biology
size_categories:
- 10M<n<100M
---
# Promoter Sequences for Various plant species
This dataset contains promoter sequences for **241 different plant species** and has been used for the pretraining step of [`Florabert`](https://huggingface.co/Gurveer05/FloraBERT). It has been created by processing the raw fasta files and the gff3 / gff files from [`Ensembl`](https://plants.ensembl.org/) and [`Refseq`](https://www.ncbi.nlm.nih.gov/refseq/).
*samtools* and *bedtools* were used to extract the promoter sequences, defined as the 1 kb region upstream of each sequence, from these files.
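As a rough illustration of that 1 kb upstream window (a minimal sketch, not the actual extraction pipeline; the function and the example coordinates below are hypothetical):
```python
def promoter_window(start: int, end: int, strand: str, flank: int = 1000):
    """Return the (start, end) of the `flank` bp region upstream of a gene,
    using 1-based GFF3-style coordinates and respecting the strand."""
    if strand == "+":
        return max(1, start - flank), start - 1
    # on the minus strand, "upstream" lies beyond the gene's end coordinate
    return end + 1, end + flank

# hypothetical gene on the minus strand
print(promoter_window(start=5_000, end=7_500, strand="-"))  # (7501, 8500)
```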
The data has been split into train and test data (90-10 split). In all, there are ~ 10 million sequences across the split files. The steps followed to obtain this data are available in this [`Github Repository`](https://github.com/gurveervirk/florabert). |
hosnasn/dreambooth-hackathon-images | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 1078145.0
num_examples: 24
download_size: 839878
dataset_size: 1078145.0
---
# Dataset Card for "dreambooth-hackathon-images"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
alespalla/chatbot_instruction_prompts | ---
license: apache-2.0
dataset_info:
features:
- name: response
dtype: string
- name: prompt
dtype: string
splits:
- name: test
num_bytes: 24612503
num_examples: 64511
- name: train
num_bytes: 98485829
num_examples: 258042
download_size: 78591384
dataset_size: 123098332
task_categories:
- question-answering
- conversational
- text-generation
language:
- en
size_categories:
- 100K<n<1M
---
# Dataset Card for Chatbot Instruction Prompts Datasets
### Dataset Summary
This dataset has been generated from the following ones:
- `tatsu-lab/alpaca`
- `Dahoas/instruct-human-assistant-prompt`
- `allenai/prosocial-dialog`
The datasets have been cleaned of spurious entries and artifacts. The result contains ~500k prompt/expected-response pairs. This DB is intended for training an instruct-type model.
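For illustration (a minimal sketch, not part of the original card), the pairs can be loaded using the column and split names declared in the metadata above:
```python
from datasets import load_dataset

# "train"/"test" splits with "prompt" and "response" string columns, per the metadata above
ds = load_dataset("alespalla/chatbot_instruction_prompts", split="train")
print(ds[0]["prompt"])
print(ds[0]["response"])
```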
|
sproos/arxiv_embeddings_480k | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: id
dtype: string
- name: abstract
dtype: string
- name: embedding
sequence: float64
splits:
- name: train
num_bytes: 6351419194
num_examples: 481271
download_size: 6014930006
dataset_size: 6351419194
---
# Dataset Card for "arxiv_embeddings_480k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Linda2023New/newdata | ---
license: openrail
---
|
davehornik/diffusionDB_filtered_prompts | ---
license: mit
task_categories:
- tabular-classification
language:
- en
tags:
- art
pretty_name: diffusionDB Prompts - 1,8mil
size_categories:
- 1M<n<10M
---
# To get access, you'll need to provide a compelling reason;
# you can contact me at: david.hornik@outlook.cz.
# This database is a WIP for future GPT training.
Dataset made out of the subset "large_all" from https://huggingface.co/datasets/poloclub/diffusiondb.
Filtered to remove:
- duplicate lines
- blank lines
- lines with a single word
- mathbold and similar Unicode
- http/s links
- unprintable/binary characters
- NUL characters
- multiple whitespaces together
- special characters like: ( [ { @ # $ % < etc.
- plus other small stuff
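A minimal sketch of this kind of cleanup follows (an illustration only; the exact regexes and thresholds used for this dataset are assumptions):
```python
import re

def clean(line: str) -> str:
    line = re.sub(r"[\[\]{}()@#$%<>]", "", line)  # drop special characters
    line = re.sub(r"\s+", " ", line)              # collapse repeated whitespace
    return line.strip()

def keep(line: str) -> bool:
    if not line or len(line.split()) < 2:         # blank lines / single-word lines
        return False
    if re.search(r"https?://\S+", line):          # http/s links
        return False
    if "\x00" in line or not line.isprintable():  # NUL / unprintable characters
        return False
    return True

seen, filtered = set(), []
for prompt in ["a cat ( in space )", "a cat ( in space )", "hi", "see http://x.y"]:
    p = clean(prompt)
    if keep(p) and p not in seen:                 # also drop duplicate lines
        seen.add(p)
        filtered.append(p)
print(filtered)  # ['a cat in space']
```
|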
thecrazyphysicist/mlc_train | ---
license: mit
---
|
autoevaluate/autoeval-staging-eval-project-emotion-af6a16fe-14025918 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- emotion
eval_info:
task: multi_class_classification
model: anindabitm/sagemaker-distilbert-emotion
metrics: []
dataset_name: emotion
dataset_config: default
dataset_split: test
col_mapping:
text: text
target: label
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Multi-class Text Classification
* Model: anindabitm/sagemaker-distilbert-emotion
* Dataset: emotion
* Config: default
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
HydraLM/tiny-codes-standardized | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
splits:
- name: train
num_bytes: 3678763115
num_examples: 3264618
download_size: 1264753822
dataset_size: 3678763115
---
# Dataset Card for "tiny-codes-standardized"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
NarchAI1992/townhouse_interior | ---
license: openrail
---
|
vvtq/train_100_pose_excluded | ---
language: en
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 11259932.0
num_examples: 100
download_size: 11259068
dataset_size: 11259932.0
---
# Dataset Card for "train_100_pose_excluded"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sazirarrwth99/web_nlg_dev | ---
dataset_info:
features:
- name: id
dtype: string
- name: old_id
dtype: string
- name: text
dtype: string
- name: category
dtype: string
- name: size
dtype: string
- name: shape
dtype: string
- name: shape_type
dtype: string
- name: triplets
dtype: string
- name: question_entities
dtype: string
- name: superclasses
dtype: string
- name: triplets_subgraph
dtype: string
- name: superclasses_new_entities
dtype: string
- name: possible_classes
dtype: string
- name: possible_classes_no_comment
dtype: string
- name: possible_object_properties
dtype: string
- name: possible_object_properties_no_comment
dtype: string
- name: possible_data_properties
dtype: string
- name: possible_data_properties_no_comment
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 5327976
num_examples: 1613
download_size: 1340366
dataset_size: 5327976
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
datahrvoje/twitter_dataset_1712701791 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 16636
num_examples: 44
download_size: 14429
dataset_size: 16636
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/yume_minami_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of yume_minami/南夢芽/南梦芽 (Azur Lane)
This is the dataset of yume_minami/南夢芽/南梦芽 (Azur Lane), containing 475 images and their tags.
The core tags of this character are `long_hair, green_eyes, brown_hair, bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 475 | 665.39 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yume_minami_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 475 | 344.34 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yume_minami_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1169 | 742.10 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yume_minami_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 475 | 571.15 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yume_minami_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1169 | 1.07 GiB | [Download](https://huggingface.co/datasets/CyberHarem/yume_minami_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/yume_minami_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
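If you prefer the plain IMG+TXT packages listed above (e.g. `dataset-800.zip`), a minimal sketch along these lines should work; it assumes each image in the archive sits next to a `.txt` file with the same stem holding its comma-separated tags, which matches the IMG+TXT description but is not guaranteed by this card:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
# download one of the IMG+TXT packages (here the 800px variant)
zip_file = hf_hub_download(
    repo_id='CyberHarem/yume_minami_azurlane',
    repo_type='dataset',
    filename='dataset-800.zip',
)
# extract files to your directory
dataset_dir = 'dataset_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
# pair every image with its tag file (assumed to share the same file stem)
for root, _, files in os.walk(dataset_dir):
    for name in sorted(files):
        stem, ext = os.path.splitext(name)
        if ext.lower() not in {'.png', '.jpg', '.jpeg', '.webp'}:
            continue
        tag_path = os.path.join(root, stem + '.txt')
        if os.path.exists(tag_path):
            with open(tag_path, 'r', encoding='utf-8') as f:
                tags = f.read().strip()
            print(os.path.join(root, name), tags)
```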
## List of Clusters
List of tag clustering results; some outfits may be mined from them.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 16 |  |  |  |  |  | 1girl, school_uniform, solo, blazer, collared_shirt, black_jacket, upper_body, white_shirt, looking_at_viewer, simple_background, long_sleeves, closed_mouth, sweater, white_background, open_jacket |
| 1 | 16 |  |  |  |  |  | 1girl, black_pantyhose, blazer, looking_at_viewer, solo, black_jacket, long_sleeves, pleated_skirt, school_uniform, black_skirt, closed_mouth, open_jacket, white_shirt, collared_shirt, miniskirt, sweater, standing, cowboy_shot, backpack |
| 2 | 6 |  |  |  |  |  | 1girl, looking_at_viewer, pleated_skirt, school_uniform, solo, black_jacket, black_pantyhose, blazer, long_sleeves, backpack |
| 3 | 19 |  |  |  |  |  | 1girl, black_pantyhose, school_uniform, solo, looking_at_viewer, blazer, pleated_skirt, simple_background, white_background |
| 4 | 8 |  |  |  |  |  | 1girl, brown_footwear, long_sleeves, looking_at_viewer, school_uniform, black_pantyhose, loafers, pleated_skirt, solo, black_jacket, blazer, white_shirt, black_skirt, collared_shirt, closed_mouth, full_body, holding, sitting, white_background, blue_skirt, simple_background, squatting |
| 5 | 14 |  |  |  |  |  | looking_at_viewer, 1girl, necktie, pink_shirt, solo, black_skirt, pleated_skirt, school_uniform, smile, short_sleeves, simple_background, bracelet, white_background, collared_shirt, socks, brown_footwear, loafers, sitting |
| 6 | 30 |  |  |  |  |  | 1girl, looking_at_viewer, solo, medium_breasts, short_shorts, navel, cleavage, simple_background, collarbone, denim_shorts, pink_bikini, white_background, white_bikini, bare_shoulders, smile |
| 7 | 5 |  |  |  |  |  | detached_collar, fake_animal_ears, looking_at_viewer, playboy_bunny, rabbit_ears, wrist_cuffs, 1girl, rabbit_tail, solo, black_leotard, black_pantyhose, blush, green_background, strapless_leotard, bare_shoulders, black_bowtie, blonde_hair, cleavage, fake_tail, fishnets, high_heels, open_mouth, simple_background, small_breasts, smile, standing, white_background, white_footwear |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | school_uniform | solo | blazer | collared_shirt | black_jacket | upper_body | white_shirt | looking_at_viewer | simple_background | long_sleeves | closed_mouth | sweater | white_background | open_jacket | black_pantyhose | pleated_skirt | black_skirt | miniskirt | standing | cowboy_shot | backpack | brown_footwear | loafers | full_body | holding | sitting | blue_skirt | squatting | necktie | pink_shirt | smile | short_sleeves | bracelet | socks | medium_breasts | short_shorts | navel | cleavage | collarbone | denim_shorts | pink_bikini | white_bikini | bare_shoulders | detached_collar | fake_animal_ears | playboy_bunny | rabbit_ears | wrist_cuffs | rabbit_tail | black_leotard | blush | green_background | strapless_leotard | black_bowtie | blonde_hair | fake_tail | fishnets | high_heels | open_mouth | small_breasts | white_footwear |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:-------|:---------|:-----------------|:---------------|:-------------|:--------------|:--------------------|:--------------------|:---------------|:---------------|:----------|:-------------------|:--------------|:------------------|:----------------|:--------------|:------------|:-----------|:--------------|:-----------|:-----------------|:----------|:------------|:----------|:----------|:-------------|:------------|:----------|:-------------|:--------|:----------------|:-----------|:--------|:-----------------|:---------------|:--------|:-----------|:-------------|:---------------|:--------------|:---------------|:-----------------|:------------------|:-------------------|:----------------|:--------------|:--------------|:--------------|:----------------|:--------|:-------------------|:--------------------|:---------------|:--------------|:------------|:-----------|:-------------|:-------------|:----------------|:-----------------|
| 0 | 16 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 16 |  |  |  |  |  | X | X | X | X | X | X | | X | X | | X | X | X | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 6 |  |  |  |  |  | X | X | X | X | | X | | | X | | X | | | | | X | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 19 |  |  |  |  |  | X | X | X | X | | | | | X | X | | | | X | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 8 |  |  |  |  |  | X | X | X | X | X | X | | X | X | X | X | X | | X | | X | X | X | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 14 |  |  |  |  |  | X | X | X | | X | | | | X | X | | | | X | | | X | X | | | | | X | X | | | X | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 30 |  |  |  |  |  | X | | X | | | | | | X | X | | | | X | | | | | | | | | | | | | | | | | | X | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | |
| 7 | 5 |  |  |  |  |  | X | | X | | | | | | X | X | | | | X | | X | | | | X | | | | | | | | | | | | X | | | | | | | X | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
stanmalkinson199/TonyKayMitchell | ---
license: openrail
---
|
open-llm-leaderboard/details_jeonsworld__CarbonVillain-en-13B-v1 | ---
pretty_name: Evaluation run of jeonsworld/CarbonVillain-en-13B-v1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [jeonsworld/CarbonVillain-en-13B-v1](https://huggingface.co/jeonsworld/CarbonVillain-en-13B-v1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jeonsworld__CarbonVillain-en-13B-v1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-29T20:14:53.981182](https://huggingface.co/datasets/open-llm-leaderboard/details_jeonsworld__CarbonVillain-en-13B-v1/blob/main/results_2023-12-29T20-14-53.981182.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6677851094474622,\n\
\ \"acc_stderr\": 0.031647346301320364,\n \"acc_norm\": 0.6687652386109932,\n\
\ \"acc_norm_stderr\": 0.032290288467975714,\n \"mc1\": 0.572827417380661,\n\
\ \"mc1_stderr\": 0.017316834410963926,\n \"mc2\": 0.7197651592692368,\n\
\ \"mc2_stderr\": 0.014984462732010536\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6843003412969283,\n \"acc_stderr\": 0.013582571095815291,\n\
\ \"acc_norm\": 0.712457337883959,\n \"acc_norm_stderr\": 0.013226719056266125\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7148974307906791,\n\
\ \"acc_stderr\": 0.00450540617660685,\n \"acc_norm\": 0.8845847440748855,\n\
\ \"acc_norm_stderr\": 0.0031886940284536315\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.756578947368421,\n \"acc_stderr\": 0.034923496688842384,\n\
\ \"acc_norm\": 0.756578947368421,\n \"acc_norm_stderr\": 0.034923496688842384\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.73,\n\
\ \"acc_stderr\": 0.04461960433384741,\n \"acc_norm\": 0.73,\n \
\ \"acc_norm_stderr\": 0.04461960433384741\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.02863723563980089,\n\
\ \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.02863723563980089\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7847222222222222,\n\
\ \"acc_stderr\": 0.03437079344106135,\n \"acc_norm\": 0.7847222222222222,\n\
\ \"acc_norm_stderr\": 0.03437079344106135\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.51,\n \"acc_stderr\": 0.05024183937956913,\n \"acc_norm\"\
: 0.51,\n \"acc_norm_stderr\": 0.05024183937956913\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n\
\ \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n\
\ \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266346,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266346\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6212765957446809,\n \"acc_stderr\": 0.03170995606040655,\n\
\ \"acc_norm\": 0.6212765957446809,\n \"acc_norm_stderr\": 0.03170995606040655\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6344827586206897,\n \"acc_stderr\": 0.040131241954243856,\n\
\ \"acc_norm\": 0.6344827586206897,\n \"acc_norm_stderr\": 0.040131241954243856\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.5026455026455027,\n \"acc_stderr\": 0.02575094967813038,\n \"\
acc_norm\": 0.5026455026455027,\n \"acc_norm_stderr\": 0.02575094967813038\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.0442626668137991,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.0442626668137991\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.8258064516129032,\n \"acc_stderr\": 0.021576248184514587,\n \"\
acc_norm\": 0.8258064516129032,\n \"acc_norm_stderr\": 0.021576248184514587\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"\
acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\"\
: 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8181818181818182,\n \"acc_stderr\": 0.03011768892950357,\n\
\ \"acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.03011768892950357\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8636363636363636,\n \"acc_stderr\": 0.024450155973189835,\n \"\
acc_norm\": 0.8636363636363636,\n \"acc_norm_stderr\": 0.024450155973189835\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.021995311963644244,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.021995311963644244\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402534,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402534\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.36666666666666664,\n \"acc_stderr\": 0.029381620726465073,\n \
\ \"acc_norm\": 0.36666666666666664,\n \"acc_norm_stderr\": 0.029381620726465073\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.029344572500634332,\n\
\ \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.029344572500634332\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.03958027231121569,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.03958027231121569\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374308,\n \"\
acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374308\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5740740740740741,\n \"acc_stderr\": 0.03372343271653062,\n \"\
acc_norm\": 0.5740740740740741,\n \"acc_norm_stderr\": 0.03372343271653062\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8578431372549019,\n \"acc_stderr\": 0.02450980392156862,\n \"\
acc_norm\": 0.8578431372549019,\n \"acc_norm_stderr\": 0.02450980392156862\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8481012658227848,\n \"acc_stderr\": 0.023363878096632446,\n \
\ \"acc_norm\": 0.8481012658227848,\n \"acc_norm_stderr\": 0.023363878096632446\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n\
\ \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n\
\ \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.03768335959728743,\n\
\ \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.03768335959728743\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.033519538795212696,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.033519538795212696\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8543689320388349,\n \"acc_stderr\": 0.03492606476623791,\n\
\ \"acc_norm\": 0.8543689320388349,\n \"acc_norm_stderr\": 0.03492606476623791\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n\
\ \"acc_stderr\": 0.02280138253459753,\n \"acc_norm\": 0.8589743589743589,\n\
\ \"acc_norm_stderr\": 0.02280138253459753\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8071519795657727,\n\
\ \"acc_stderr\": 0.014108533515757431,\n \"acc_norm\": 0.8071519795657727,\n\
\ \"acc_norm_stderr\": 0.014108533515757431\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7543352601156069,\n \"acc_stderr\": 0.023176298203992005,\n\
\ \"acc_norm\": 0.7543352601156069,\n \"acc_norm_stderr\": 0.023176298203992005\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.39888268156424583,\n\
\ \"acc_stderr\": 0.016376966142610073,\n \"acc_norm\": 0.39888268156424583,\n\
\ \"acc_norm_stderr\": 0.016376966142610073\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7581699346405228,\n \"acc_stderr\": 0.024518195641879334,\n\
\ \"acc_norm\": 0.7581699346405228,\n \"acc_norm_stderr\": 0.024518195641879334\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7266881028938906,\n\
\ \"acc_stderr\": 0.025311765975426122,\n \"acc_norm\": 0.7266881028938906,\n\
\ \"acc_norm_stderr\": 0.025311765975426122\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7808641975308642,\n \"acc_stderr\": 0.02301670564026219,\n\
\ \"acc_norm\": 0.7808641975308642,\n \"acc_norm_stderr\": 0.02301670564026219\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4934810951760104,\n\
\ \"acc_stderr\": 0.012769150688867503,\n \"acc_norm\": 0.4934810951760104,\n\
\ \"acc_norm_stderr\": 0.012769150688867503\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7389705882352942,\n \"acc_stderr\": 0.026679252270103128,\n\
\ \"acc_norm\": 0.7389705882352942,\n \"acc_norm_stderr\": 0.026679252270103128\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6797385620915033,\n \"acc_stderr\": 0.018875682938069446,\n \
\ \"acc_norm\": 0.6797385620915033,\n \"acc_norm_stderr\": 0.018875682938069446\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.02812342933514278,\n\
\ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.02812342933514278\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.02587064676616913,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.02587064676616913\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776334,\n \
\ \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776334\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5843373493975904,\n\
\ \"acc_stderr\": 0.03836722176598053,\n \"acc_norm\": 0.5843373493975904,\n\
\ \"acc_norm_stderr\": 0.03836722176598053\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03188578017686398,\n\
\ \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03188578017686398\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.572827417380661,\n\
\ \"mc1_stderr\": 0.017316834410963926,\n \"mc2\": 0.7197651592692368,\n\
\ \"mc2_stderr\": 0.014984462732010536\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8326756116811366,\n \"acc_stderr\": 0.010490608806828075\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6429112964366944,\n \
\ \"acc_stderr\": 0.013197931775445206\n }\n}\n```"
repo_url: https://huggingface.co/jeonsworld/CarbonVillain-en-13B-v1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_29T20_14_53.981182
path:
- '**/details_harness|arc:challenge|25_2023-12-29T20-14-53.981182.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-29T20-14-53.981182.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_29T20_14_53.981182
path:
- '**/details_harness|gsm8k|5_2023-12-29T20-14-53.981182.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-29T20-14-53.981182.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_29T20_14_53.981182
path:
- '**/details_harness|hellaswag|10_2023-12-29T20-14-53.981182.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-29T20-14-53.981182.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_29T20_14_53.981182
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-29T20-14-53.981182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-29T20-14-53.981182.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-29T20-14-53.981182.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_29T20_14_53.981182
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T20-14-53.981182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T20-14-53.981182.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_29T20_14_53.981182
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-29T20-14-53.981182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-29T20-14-53.981182.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_29T20_14_53.981182
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-29T20-14-53.981182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-29T20-14-53.981182.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_29T20_14_53.981182
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T20-14-53.981182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T20-14-53.981182.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_29T20_14_53.981182
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T20-14-53.981182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T20-14-53.981182.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_29T20_14_53.981182
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-29T20-14-53.981182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-29T20-14-53.981182.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_29T20_14_53.981182
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T20-14-53.981182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T20-14-53.981182.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_29T20_14_53.981182
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T20-14-53.981182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T20-14-53.981182.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_29T20_14_53.981182
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T20-14-53.981182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T20-14-53.981182.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_29T20_14_53.981182
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T20-14-53.981182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T20-14-53.981182.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_29T20_14_53.981182
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-29T20-14-53.981182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-29T20-14-53.981182.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_29T20_14_53.981182
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-29T20-14-53.981182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-29T20-14-53.981182.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_29T20_14_53.981182
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T20-14-53.981182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T20-14-53.981182.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_29T20_14_53.981182
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-29T20-14-53.981182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-29T20-14-53.981182.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_29T20_14_53.981182
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T20-14-53.981182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T20-14-53.981182.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_29T20_14_53.981182
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T20-14-53.981182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T20-14-53.981182.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_29T20_14_53.981182
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T20-14-53.981182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T20-14-53.981182.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_29T20_14_53.981182
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-29T20-14-53.981182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-29T20-14-53.981182.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_29T20_14_53.981182
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T20-14-53.981182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T20-14-53.981182.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_29T20_14_53.981182
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T20-14-53.981182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T20-14-53.981182.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_29T20_14_53.981182
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T20-14-53.981182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T20-14-53.981182.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_29T20_14_53.981182
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T20-14-53.981182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T20-14-53.981182.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_29T20_14_53.981182
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T20-14-53.981182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T20-14-53.981182.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_29T20_14_53.981182
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T20-14-53.981182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T20-14-53.981182.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_29T20_14_53.981182
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T20-14-53.981182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T20-14-53.981182.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_29T20_14_53.981182
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T20-14-53.981182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T20-14-53.981182.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_29T20_14_53.981182
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T20-14-53.981182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T20-14-53.981182.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_29T20_14_53.981182
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T20-14-53.981182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T20-14-53.981182.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_29T20_14_53.981182
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T20-14-53.981182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T20-14-53.981182.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_29T20_14_53.981182
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T20-14-53.981182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T20-14-53.981182.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_29T20_14_53.981182
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T20-14-53.981182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T20-14-53.981182.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_29T20_14_53.981182
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T20-14-53.981182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T20-14-53.981182.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_29T20_14_53.981182
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-29T20-14-53.981182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-29T20-14-53.981182.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_29T20_14_53.981182
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T20-14-53.981182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T20-14-53.981182.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_29T20_14_53.981182
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-29T20-14-53.981182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-29T20-14-53.981182.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_29T20_14_53.981182
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T20-14-53.981182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T20-14-53.981182.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_29T20_14_53.981182
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T20-14-53.981182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T20-14-53.981182.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_29T20_14_53.981182
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T20-14-53.981182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T20-14-53.981182.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_29T20_14_53.981182
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-29T20-14-53.981182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-29T20-14-53.981182.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_29T20_14_53.981182
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-29T20-14-53.981182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-29T20-14-53.981182.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_29T20_14_53.981182
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T20-14-53.981182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T20-14-53.981182.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_29T20_14_53.981182
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T20-14-53.981182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T20-14-53.981182.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_29T20_14_53.981182
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T20-14-53.981182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T20-14-53.981182.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_29T20_14_53.981182
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T20-14-53.981182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T20-14-53.981182.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_29T20_14_53.981182
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-29T20-14-53.981182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-29T20-14-53.981182.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_29T20_14_53.981182
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-29T20-14-53.981182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-29T20-14-53.981182.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_29T20_14_53.981182
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-29T20-14-53.981182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-29T20-14-53.981182.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_29T20_14_53.981182
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T20-14-53.981182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T20-14-53.981182.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_29T20_14_53.981182
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-29T20-14-53.981182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-29T20-14-53.981182.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_29T20_14_53.981182
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T20-14-53.981182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T20-14-53.981182.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_29T20_14_53.981182
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T20-14-53.981182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T20-14-53.981182.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_29T20_14_53.981182
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-29T20-14-53.981182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-29T20-14-53.981182.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_29T20_14_53.981182
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-29T20-14-53.981182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-29T20-14-53.981182.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_29T20_14_53.981182
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-29T20-14-53.981182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-29T20-14-53.981182.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_29T20_14_53.981182
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T20-14-53.981182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T20-14-53.981182.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_29T20_14_53.981182
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-29T20-14-53.981182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-29T20-14-53.981182.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_29T20_14_53.981182
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-29T20-14-53.981182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-29T20-14-53.981182.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_29T20_14_53.981182
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-29T20-14-53.981182.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-29T20-14-53.981182.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_29T20_14_53.981182
path:
- '**/details_harness|winogrande|5_2023-12-29T20-14-53.981182.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-29T20-14-53.981182.parquet'
- config_name: results
data_files:
- split: 2023_12_29T20_14_53.981182
path:
- results_2023-12-29T20-14-53.981182.parquet
- split: latest
path:
- results_2023-12-29T20-14-53.981182.parquet
---
# Dataset Card for Evaluation run of jeonsworld/CarbonVillain-en-13B-v1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [jeonsworld/CarbonVillain-en-13B-v1](https://huggingface.co/jeonsworld/CarbonVillain-en-13B-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jeonsworld__CarbonVillain-en-13B-v1",
"harness_winogrande_5",
split="train")
```
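The aggregated metrics are exposed through the "results" configuration, and every task configuration also defines a "latest" split (see the configs above), so a sketch along these lines should work as well, for instance for the aggregated results and the GSM8K details:
```python
from datasets import load_dataset
# aggregated metrics of the run (the "results" config, "latest" split)
results = load_dataset(
    "open-llm-leaderboard/details_jeonsworld__CarbonVillain-en-13B-v1",
    "results",
    split="latest")
# per-sample details for a single task, pinned to the latest run
gsm8k_details = load_dataset(
    "open-llm-leaderboard/details_jeonsworld__CarbonVillain-en-13B-v1",
    "harness_gsm8k_5",
    split="latest")
```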
## Latest results
These are the [latest results from run 2023-12-29T20:14:53.981182](https://huggingface.co/datasets/open-llm-leaderboard/details_jeonsworld__CarbonVillain-en-13B-v1/blob/main/results_2023-12-29T20-14-53.981182.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6677851094474622,
"acc_stderr": 0.031647346301320364,
"acc_norm": 0.6687652386109932,
"acc_norm_stderr": 0.032290288467975714,
"mc1": 0.572827417380661,
"mc1_stderr": 0.017316834410963926,
"mc2": 0.7197651592692368,
"mc2_stderr": 0.014984462732010536
},
"harness|arc:challenge|25": {
"acc": 0.6843003412969283,
"acc_stderr": 0.013582571095815291,
"acc_norm": 0.712457337883959,
"acc_norm_stderr": 0.013226719056266125
},
"harness|hellaswag|10": {
"acc": 0.7148974307906791,
"acc_stderr": 0.00450540617660685,
"acc_norm": 0.8845847440748855,
"acc_norm_stderr": 0.0031886940284536315
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.756578947368421,
"acc_stderr": 0.034923496688842384,
"acc_norm": 0.756578947368421,
"acc_norm_stderr": 0.034923496688842384
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.73,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.73,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.02863723563980089,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.02863723563980089
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7847222222222222,
"acc_stderr": 0.03437079344106135,
"acc_norm": 0.7847222222222222,
"acc_norm_stderr": 0.03437079344106135
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956913,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956913
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266346,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266346
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6212765957446809,
"acc_stderr": 0.03170995606040655,
"acc_norm": 0.6212765957446809,
"acc_norm_stderr": 0.03170995606040655
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6344827586206897,
"acc_stderr": 0.040131241954243856,
"acc_norm": 0.6344827586206897,
"acc_norm_stderr": 0.040131241954243856
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.5026455026455027,
"acc_stderr": 0.02575094967813038,
"acc_norm": 0.5026455026455027,
"acc_norm_stderr": 0.02575094967813038
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.0442626668137991,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.0442626668137991
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8258064516129032,
"acc_stderr": 0.021576248184514587,
"acc_norm": 0.8258064516129032,
"acc_norm_stderr": 0.021576248184514587
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.03011768892950357,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.03011768892950357
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8636363636363636,
"acc_stderr": 0.024450155973189835,
"acc_norm": 0.8636363636363636,
"acc_norm_stderr": 0.024450155973189835
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.021995311963644244,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.021995311963644244
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402534,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402534
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.36666666666666664,
"acc_stderr": 0.029381620726465073,
"acc_norm": 0.36666666666666664,
"acc_norm_stderr": 0.029381620726465073
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.029344572500634332,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.029344572500634332
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.03958027231121569,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.03958027231121569
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374308,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374308
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5740740740740741,
"acc_stderr": 0.03372343271653062,
"acc_norm": 0.5740740740740741,
"acc_norm_stderr": 0.03372343271653062
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8578431372549019,
"acc_stderr": 0.02450980392156862,
"acc_norm": 0.8578431372549019,
"acc_norm_stderr": 0.02450980392156862
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8481012658227848,
"acc_stderr": 0.023363878096632446,
"acc_norm": 0.8481012658227848,
"acc_norm_stderr": 0.023363878096632446
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.03149384670994131,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.03149384670994131
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.03768335959728743,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.03768335959728743
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.8543689320388349,
"acc_stderr": 0.03492606476623791,
"acc_norm": 0.8543689320388349,
"acc_norm_stderr": 0.03492606476623791
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.02280138253459753,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.02280138253459753
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8071519795657727,
"acc_stderr": 0.014108533515757431,
"acc_norm": 0.8071519795657727,
"acc_norm_stderr": 0.014108533515757431
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7543352601156069,
"acc_stderr": 0.023176298203992005,
"acc_norm": 0.7543352601156069,
"acc_norm_stderr": 0.023176298203992005
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.39888268156424583,
"acc_stderr": 0.016376966142610073,
"acc_norm": 0.39888268156424583,
"acc_norm_stderr": 0.016376966142610073
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7581699346405228,
"acc_stderr": 0.024518195641879334,
"acc_norm": 0.7581699346405228,
"acc_norm_stderr": 0.024518195641879334
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7266881028938906,
"acc_stderr": 0.025311765975426122,
"acc_norm": 0.7266881028938906,
"acc_norm_stderr": 0.025311765975426122
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7808641975308642,
"acc_stderr": 0.02301670564026219,
"acc_norm": 0.7808641975308642,
"acc_norm_stderr": 0.02301670564026219
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4934810951760104,
"acc_stderr": 0.012769150688867503,
"acc_norm": 0.4934810951760104,
"acc_norm_stderr": 0.012769150688867503
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7389705882352942,
"acc_stderr": 0.026679252270103128,
"acc_norm": 0.7389705882352942,
"acc_norm_stderr": 0.026679252270103128
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6797385620915033,
"acc_stderr": 0.018875682938069446,
"acc_norm": 0.6797385620915033,
"acc_norm_stderr": 0.018875682938069446
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.02812342933514278,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.02812342933514278
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616913,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616913
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.9,
"acc_stderr": 0.030151134457776334,
"acc_norm": 0.9,
"acc_norm_stderr": 0.030151134457776334
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5843373493975904,
"acc_stderr": 0.03836722176598053,
"acc_norm": 0.5843373493975904,
"acc_norm_stderr": 0.03836722176598053
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03188578017686398,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03188578017686398
},
"harness|truthfulqa:mc|0": {
"mc1": 0.572827417380661,
"mc1_stderr": 0.017316834410963926,
"mc2": 0.7197651592692368,
"mc2_stderr": 0.014984462732010536
},
"harness|winogrande|5": {
"acc": 0.8326756116811366,
"acc_stderr": 0.010490608806828075
},
"harness|gsm8k|5": {
"acc": 0.6429112964366944,
"acc_stderr": 0.013197931775445206
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
SilkGPT/Silk_fMRI_NSD | ---
license: cc0-1.0
---
|
DBQ/Net.a.Porter.Product.prices.Poland | ---
annotations_creators:
- other
language_creators:
- other
language:
- en
license:
- unknown
multilinguality:
- monolingual
source_datasets:
- original
task_categories:
- text-classification
- image-classification
- feature-extraction
- image-segmentation
- image-to-image
- image-to-text
- object-detection
- summarization
- zero-shot-image-classification
pretty_name: Poland - Net-a-Porter - Product-level price list
tags:
- webscraping
- ecommerce
- Net
- fashion
- fashion product
- image
- fashion image
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: website_name
dtype: string
- name: competence_date
dtype: string
- name: country_code
dtype: string
- name: currency_code
dtype: string
- name: brand
dtype: string
- name: category1_code
dtype: string
- name: category2_code
dtype: string
- name: category3_code
dtype: string
- name: product_code
dtype: int64
- name: title
dtype: string
- name: itemurl
dtype: string
- name: imageurl
dtype: string
- name: full_price
dtype: float64
- name: price
dtype: float64
- name: full_price_eur
dtype: float64
- name: price_eur
dtype: float64
- name: flg_discount
dtype: int64
splits:
- name: train
num_bytes: 17622494
num_examples: 43226
download_size: 5517346
dataset_size: 17622494
---
# Net-a-Porter web scraped data
## About the website
In the **EMEA region**, and particularly in **Poland**, where **Net-a-Porter** operates, the relevant industry is the **online fashion retail sector**. This industry has seen significant growth in recent years, driven by increased internet penetration and a growing consumer preference for online shopping. The Polish e-commerce market has expanded rapidly, with more consumers becoming comfortable purchasing a wide variety of products online. The **dataset** covers **e-commerce product-list page (PLP) data from Net-a-Porter in Poland** and provides insight into shopping trends, customer preferences, and the overall performance of products on the platform.
## Link to **dataset**
[Poland - Net-a-Porter - Product-level price list dataset](https://www.databoutique.com/buy-data-page/Net-a-Porter%20Product-prices%20Poland/r/recreBSBJ60wTY54C)
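## Loading example
For quick exploration, a minimal loading sketch (the repo ID, the `train` split, and column names such as `price_eur` and `flg_discount` are taken from the metadata above):
```python
from datasets import load_dataset

# Load the product-level price list (default config, train split).
ds = load_dataset("DBQ/Net.a.Porter.Product.prices.Poland", split="train")

# Example: share of discounted products and their average price in EUR.
discounted = ds.filter(lambda row: row["flg_discount"] == 1)
print(f"{len(discounted)} of {len(ds)} products are flagged as discounted")
print(f"average discounted price: {sum(discounted['price_eur']) / max(len(discounted), 1):.2f} EUR")
```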
|
CyberHarem/london_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of london/ロンドン/伦敦 (Azur Lane)
This is the dataset of london/ロンドン/伦敦 (Azur Lane), containing 77 images and their tags.
The core tags of this character are `glasses, long_hair, semi-rimless_eyewear, red_eyes, breasts, brown_hair, red-framed_eyewear, ahoge, earrings, hat, under-rim_eyewear, beret, bangs, large_breasts, two_side_up`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 77 | 93.03 MiB | [Download](https://huggingface.co/datasets/CyberHarem/london_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 77 | 56.15 MiB | [Download](https://huggingface.co/datasets/CyberHarem/london_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 171 | 110.85 MiB | [Download](https://huggingface.co/datasets/CyberHarem/london_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 77 | 83.50 MiB | [Download](https://huggingface.co/datasets/CyberHarem/london_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 171 | 152.13 MiB | [Download](https://huggingface.co/datasets/CyberHarem/london_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/london_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 11 |  |  |  |  |  | 1girl, long_sleeves, looking_at_viewer, smile, jewelry, pleated_skirt, solo, white_gloves, white_shirt, thigh_strap, blue_skirt, collared_shirt, white_background, closed_mouth, high-waist_skirt, multicolored_hair, side-tie_panties, simple_background, blush, kneehighs, necktie, open_jacket, blue_ribbon, brown_jacket, hair_ornament, holding, medium_breasts, neck_ribbon, rigging, white_jacket, wind_lift |
| 1 | 9 |  |  |  |  |  | 1girl, jewelry, looking_at_viewer, necktie, solo, retrofit_(azur_lane), smile, long_sleeves, simple_background, dress, open_mouth, pantyhose, white_background, blue_headwear, blush, center_frills, hair_between_eyes |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | long_sleeves | looking_at_viewer | smile | jewelry | pleated_skirt | solo | white_gloves | white_shirt | thigh_strap | blue_skirt | collared_shirt | white_background | closed_mouth | high-waist_skirt | multicolored_hair | side-tie_panties | simple_background | blush | kneehighs | necktie | open_jacket | blue_ribbon | brown_jacket | hair_ornament | holding | medium_breasts | neck_ribbon | rigging | white_jacket | wind_lift | retrofit_(azur_lane) | dress | open_mouth | pantyhose | blue_headwear | center_frills | hair_between_eyes |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:--------------------|:--------|:----------|:----------------|:-------|:---------------|:--------------|:--------------|:-------------|:-----------------|:-------------------|:---------------|:-------------------|:--------------------|:-------------------|:--------------------|:--------|:------------|:----------|:--------------|:--------------|:---------------|:----------------|:----------|:-----------------|:--------------|:----------|:---------------|:------------|:-----------------------|:--------|:-------------|:------------|:----------------|:----------------|:--------------------|
| 0 | 11 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | |
| 1 | 9 |  |  |  |  |  | X | X | X | X | X | | X | | | | | | X | | | | | X | X | | X | | | | | | | | | | | X | X | X | X | X | X | X |
|
tjspross/ctb6 | ---
dataset_info:
features:
- name: tokens
sequence: string
- name: cws_tags
sequence:
class_label:
names:
'0': B-SEG
'1': M-SEG
'2': E-SEG
'3': S-SEG
splits:
- name: train
num_bytes: 16019709
num_examples: 23458
- name: test
num_bytes: 2033816
num_examples: 2796
- name: dev
num_bytes: 1520164
num_examples: 2079
download_size: 2580231
dataset_size: 19573689
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: dev
path: data/dev-*
---
|
arzumanabbasov/az-banks-customers-instagram-comments-and-answers | ---
license: mit
task_categories:
- text-generation
- text2text-generation
- question-answering
language:
- az
tags:
- banking
- customer_reviews
pretty_name: Azerbaijani Banks Customer Instagram Comments and Banks Answers
size_categories:
- 10K<n<100K
---
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [Arzuman Abbasov]
- **Funded by [optional]:** [Arzuman Abbasov]
- **Shared by [optional]:** [Arzuman Abbasov]
- **Language(s) (NLP):** [Azerbaijani]
- **License:** [MIT]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
pphuc25/VLSP_T2 | ---
language: vi
dataset_info:
features:
- name: audio
dtype: audio
- name: text
dtype: string
splits:
- name: train
num_bytes: 689551911.12
num_examples: 18843
download_size: 693488600
dataset_size: 689551911.12
---
# Dataset Card for "VLSP_T2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
GEM-submissions/lewtun__this-is-a-test-submission-1__1656014763 | ---
benchmark: gem
type: prediction
submission_name: This is a test submission 1
tags:
- evaluation
- benchmark
---
# GEM Submission
Submission name: This is a test submission 1
|
benayas/snips_chatgpt_20pct_v1 | ---
dataset_info:
features:
- name: text
dtype: string
- name: category
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 1044286
num_examples: 13084
download_size: 416032
dataset_size: 1044286
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
AppleHarem/haze_arknights | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of haze (Arknights)
This is the dataset of haze (Arknights), containing 73 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
A WebUI that contains the crawlers and other tools is also available: [LittleAppleWebUI](https://github.com/LittleApple-fp16/LittleAppleWebUI).
| Name | Images | Download | Description |
|:----------------|---------:|:----------------------------------------|:-----------------------------------------------------------------------------------------|
| raw | 73 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 163 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| raw-stage3-eyes | 175 | [Download](dataset-raw-stage3-eyes.zip) | 3-stage cropped (with eye-focus) raw data with meta information. |
| 384x512 | 73 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x704 | 73 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x880 | 73 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 163 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 163 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-p512-640 | 82 | [Download](dataset-stage3-p512-640.zip) | 3-stage cropped dataset with the area not less than 512x512 pixels. |
| stage3-eyes-640 | 175 | [Download](dataset-stage3-eyes-640.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 640 pixels. |
| stage3-eyes-800 | 175 | [Download](dataset-stage3-eyes-800.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 800 pixels. |
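The archives can also be fetched programmatically; a minimal sketch using `huggingface_hub`, assuming the zip files listed above are stored at the root of this dataset repository:
```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# Download one of the packages listed above (here: the raw archive).
zip_file = hf_hub_download(
    repo_id='AppleHarem/haze_arknights',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# Extract the archive into a local directory.
dataset_dir = 'haze_arknights_dataset'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
```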
|
autoevaluate/autoeval-staging-eval-project-Blaise-g__SumPubmed-c8bf564e-12335645 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- Blaise-g/SumPubmed
eval_info:
task: summarization
model: Blaise-g/led_pubmed_sumpubmed_5
metrics: ['bertscore']
dataset_name: Blaise-g/SumPubmed
dataset_config: Blaise-g--SumPubmed
dataset_split: test
col_mapping:
text: text
target: abstract
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: Blaise-g/led_pubmed_sumpubmed_5
* Dataset: Blaise-g/SumPubmed
* Config: Blaise-g--SumPubmed
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@Blaise-g](https://huggingface.co/Blaise-g) for evaluating this model. |
Sam172/Patents48448 | ---
license: bigscience-openrail-m
---
|
quyennt/nq42_faq | ---
dataset_info:
features:
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 29893
num_examples: 78
download_size: 13850
dataset_size: 29893
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
louisbrulenaudet/code-defense | ---
license: apache-2.0
language:
- fr
multilinguality:
- monolingual
tags:
- finetuning
- legal
- french law
- droit français
- Code de la défense
source_datasets:
- original
pretty_name: Code de la défense
task_categories:
- text-generation
- table-question-answering
- summarization
- text-retrieval
- question-answering
- text-classification
size_categories:
- 1K<n<10K
---
# Code de la défense, non-instruct (2024-04-15)
This project focuses on fine-tuning pre-trained language models to create efficient and accurate models for legal practice.
Fine-tuning is the process of adapting a pre-trained model to perform specific tasks or cater to particular domains. It involves adjusting the model's parameters through a further round of training on task-specific or domain-specific data. While conventional fine-tuning strategies involve supervised learning with labeled data, instruction-based fine-tuning introduces a more structured and interpretable approach.
Instruction-based fine-tuning leverages the power of human-provided instructions to guide the model's behavior. These instructions can be in the form of text prompts, prompts with explicit task descriptions, or a combination of both. This approach allows for a more controlled and context-aware interaction with the LLM, making it adaptable to a multitude of specialized tasks.
Instruction-based fine-tuning significantly enhances the performance of LLMs in the following ways:
- Task-Specific Adaptation: LLMs, when fine-tuned with specific instructions, exhibit remarkable adaptability to diverse tasks. They can switch seamlessly between translation, summarization, and question-answering, guided by the provided instructions.
- Reduced Ambiguity: Traditional LLMs might generate ambiguous or contextually inappropriate responses. Instruction-based fine-tuning allows for a clearer and more context-aware generation, reducing the likelihood of nonsensical outputs.
- Efficient Knowledge Transfer: Instructions can encapsulate domain-specific knowledge, enabling LLMs to benefit from expert guidance. This knowledge transfer is particularly valuable in fields like tax practice, law, medicine, and more.
- Interpretability: Instruction-based fine-tuning also makes LLM behavior more interpretable. Since the instructions are human-readable, it becomes easier to understand and control model outputs.
- Adaptive Behavior: LLMs, post instruction-based fine-tuning, exhibit adaptive behavior that is responsive to both explicit task descriptions and implicit cues within the provided text.
## Concurrent reading of the LegalKit
To use all the legal data published on LegalKit, you can use this code snippet:
```python
# -*- coding: utf-8 -*-
import concurrent.futures
import logging
import os
import datasets
from tqdm.notebook import tqdm
def dataset_loader(
name:str,
streaming:bool=True
) -> datasets.Dataset:
"""
Helper function to load a single dataset in parallel.
Parameters
----------
name : str
Name of the dataset to be loaded.
streaming : bool, optional
Determines if datasets are streamed. Default is True.
Returns
-------
dataset : datasets.Dataset
Loaded dataset object.
Raises
------
Exception
If an error occurs during dataset loading.
"""
try:
return datasets.load_dataset(
name,
split="train",
streaming=streaming
)
except Exception as exc:
logging.error(f"Error loading dataset {name}: {exc}")
return None
def load_datasets(
req:list,
streaming:bool=True
) -> list:
"""
Downloads datasets specified in a list and creates a list of loaded datasets.
Parameters
----------
req : list
A list containing the names of datasets to be downloaded.
streaming : bool, optional
Determines if datasets are streamed. Default is True.
Returns
-------
datasets_list : list
A list containing loaded datasets as per the requested names provided in 'req'.
Raises
------
Exception
If an error occurs during dataset loading or processing.
Examples
--------
>>> datasets = load_datasets(["dataset1", "dataset2"], streaming=False)
"""
datasets_list = []
with concurrent.futures.ThreadPoolExecutor() as executor:
future_to_dataset = {executor.submit(dataset_loader, name): name for name in req}
for future in tqdm(concurrent.futures.as_completed(future_to_dataset), total=len(req)):
name = future_to_dataset[future]
try:
dataset = future.result()
if dataset:
datasets_list.append(dataset)
except Exception as exc:
logging.error(f"Error processing dataset {name}: {exc}")
return datasets_list
req = [
"louisbrulenaudet/code-artisanat",
"louisbrulenaudet/code-action-sociale-familles",
# ...
]
datasets_list = load_datasets(
req=req,
streaming=True
)
dataset = datasets.concatenate_datasets(
datasets_list
)
```
## Dataset generation
This JSON file is a list of dictionaries, each dictionary contains the following fields:
- `instruction`: `string`, presenting the instruction linked to the element.
- `input`: `string`, signifying the input details for the element.
- `output`: `string`, indicating the output information for the element.
- `start`: `string`, the date of entry into force of the article.
- `expiration`: `string`, the date of expiration of the article.
- `num`: `string`, the id of the article.
We used the following list of instructions for generating the dataset:
```python
instructions = [
"Compose l'intégralité de l'article sous forme écrite.",
"Écris la totalité du contenu de l'article.",
"Formule la totalité du texte présent dans l'article.",
"Produis l'intégralité de l'article en écriture.",
"Développe l'article dans son ensemble par écrit.",
"Génère l'ensemble du texte contenu dans l'article.",
"Formule le contenu intégral de l'article en entier.",
"Rédige la totalité du texte de l'article en entier.",
"Compose l'intégralité du contenu textuel de l'article.",
"Rédige l'ensemble du texte qui constitue l'article.",
"Formule l'article entier dans son contenu écrit.",
"Composez l'intégralité de l'article sous forme écrite.",
"Écrivez la totalité du contenu de l'article.",
"Formulez la totalité du texte présent dans l'article.",
"Développez l'article dans son ensemble par écrit.",
"Générez l'ensemble du texte contenu dans l'article.",
"Formulez le contenu intégral de l'article en entier.",
"Rédigez la totalité du texte de l'article en entier.",
"Composez l'intégralité du contenu textuel de l'article.",
"Écrivez l'article dans son intégralité en termes de texte.",
"Rédigez l'ensemble du texte qui constitue l'article.",
"Formulez l'article entier dans son contenu écrit.",
"Composer l'intégralité de l'article sous forme écrite.",
"Écrire la totalité du contenu de l'article.",
"Formuler la totalité du texte présent dans l'article.",
"Produire l'intégralité de l'article en écriture.",
"Développer l'article dans son ensemble par écrit.",
"Générer l'ensemble du texte contenu dans l'article.",
"Formuler le contenu intégral de l'article en entier.",
"Rédiger la totalité du texte de l'article en entier.",
"Composer l'intégralité du contenu textuel de l'article.",
"Rédiger l'ensemble du texte qui constitue l'article.",
"Formuler l'article entier dans son contenu écrit.",
"Quelles sont les dispositions de l'article ?",
"Quelles dispositions sont incluses dans l'article ?",
"Quelles sont les dispositions énoncées dans l'article ?",
"Quel est le texte intégral de l'article ?",
"Quelle est la lettre de l'article ?"
]
```
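To illustrate how these fields fit together in practice, here is a minimal, hypothetical sketch that assembles one record into a single text sample (the instruction string comes from the list above; all other values are placeholders, not taken from the dataset):
```python
# Hypothetical record following the field layout described above
# (values other than the instruction are placeholders for illustration only).
record = {
    "instruction": "Compose l'intégralité de l'article sous forme écrite.",
    "input": "Code de la défense, art. <numéro de l'article>",
    "output": "<texte intégral de l'article>",
    "start": "<date d'entrée en vigueur>",
    "expiration": "<date d'expiration>",
    "num": "<id de l'article>",
}

def to_text_sample(rec: dict) -> str:
    """One possible way to concatenate the fields into a fine-tuning sample."""
    return f"{rec['instruction']}\n\n{rec['input']}\n\n{rec['output']}"

print(to_text_sample(record))
```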
## Feedback
If you have any feedback, please reach out at [louisbrulenaudet@icloud.com](mailto:louisbrulenaudet@icloud.com). |
CyberHarem/saki_bluearchive | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of saki/空井サキ/咲 (Blue Archive)
This is the dataset of saki/空井サキ/咲 (Blue Archive), containing 500 images and their tags.
The core tags of this character are `short_hair, halo, breasts, purple_hair, large_breasts, hat, blue_eyes, bucket_hat, black_headwear, green_halo`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 858.58 MiB | [Download](https://huggingface.co/datasets/CyberHarem/saki_bluearchive/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 500 | 701.99 MiB | [Download](https://huggingface.co/datasets/CyberHarem/saki_bluearchive/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1287 | 1.46 GiB | [Download](https://huggingface.co/datasets/CyberHarem/saki_bluearchive/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/saki_bluearchive',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 9 |  |  |  |  |  | 1girl, black_gloves, blue_sailor_collar, helmet, long_sleeves, solo, yellow_neckerchief, looking_at_viewer, pleated_skirt, animal_ears, blue_serafuku, blue_skirt, knee_pads, open_mouth, black_hair, light_machine_gun, blush, green_eyes, white_skirt, black_footwear, boots, holding_gun |
| 1 | 9 |  |  |  |  |  | 1boy, 1girl, blush, hetero, long_sleeves, solo_focus, blue_sailor_collar, penis, black_gloves, nipples, open_mouth, yellow_neckerchief, helmet, looking_at_viewer, paizuri, serafuku, bar_censor, cum, pov_crotch, shirt_lift, sweat |
| 2 | 34 |  |  |  |  |  | 1girl, long_sleeves, official_alternate_costume, raglan_sleeves, rash_guard, solo, navel, stomach, blue_bikini, looking_at_viewer, bikini_bottom_only, blush, simple_background, white_background, closed_mouth, thighs, cowboy_shot, cropped_jacket |
| 3 | 12 |  |  |  |  |  | 1girl, blue_bikini, blue_sky, day, long_sleeves, official_alternate_costume, outdoors, raglan_sleeves, solo, blush, looking_at_viewer, navel, rash_guard, stomach, cloud, bikini_bottom_only, cowboy_shot, cropped_jacket, thighs, utility_belt, belt_pouch, closed_mouth, duffel_bag, green_eyes, open_mouth, smile |
| 4 | 6 |  |  |  |  |  | short_sleeves, white_shirt, 1girl, blue_apron, solo, blush, collared_shirt, helmet, looking_at_viewer, official_alternate_costume, black_bowtie, blurry_background, white_headwear |
| 5 | 6 |  |  |  |  |  | 1girl, alternate_costume, fake_animal_ears, looking_at_viewer, rabbit_ears, bowtie, cleavage, detached_collar, pantyhose, playboy_bunny, simple_background, solo, wrist_cuffs, bare_shoulders, blush, green_eyes, strapless_leotard, blue_leotard, closed_mouth, covered_navel, open_mouth, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_gloves | blue_sailor_collar | helmet | long_sleeves | solo | yellow_neckerchief | looking_at_viewer | pleated_skirt | animal_ears | blue_serafuku | blue_skirt | knee_pads | open_mouth | black_hair | light_machine_gun | blush | green_eyes | white_skirt | black_footwear | boots | holding_gun | 1boy | hetero | solo_focus | penis | nipples | paizuri | serafuku | bar_censor | cum | pov_crotch | shirt_lift | sweat | official_alternate_costume | raglan_sleeves | rash_guard | navel | stomach | blue_bikini | bikini_bottom_only | simple_background | white_background | closed_mouth | thighs | cowboy_shot | cropped_jacket | blue_sky | day | outdoors | cloud | utility_belt | belt_pouch | duffel_bag | smile | short_sleeves | white_shirt | blue_apron | collared_shirt | black_bowtie | blurry_background | white_headwear | alternate_costume | fake_animal_ears | rabbit_ears | bowtie | cleavage | detached_collar | pantyhose | playboy_bunny | wrist_cuffs | bare_shoulders | strapless_leotard | blue_leotard | covered_navel |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:---------------------|:---------|:---------------|:-------|:---------------------|:--------------------|:----------------|:--------------|:----------------|:-------------|:------------|:-------------|:-------------|:--------------------|:--------|:-------------|:--------------|:-----------------|:--------|:--------------|:-------|:---------|:-------------|:--------|:----------|:----------|:-----------|:-------------|:------|:-------------|:-------------|:--------|:-----------------------------|:-----------------|:-------------|:--------|:----------|:--------------|:---------------------|:--------------------|:-------------------|:---------------|:---------|:--------------|:-----------------|:-----------|:------|:-----------|:--------|:---------------|:-------------|:-------------|:--------|:----------------|:--------------|:-------------|:-----------------|:---------------|:--------------------|:-----------------|:--------------------|:-------------------|:--------------|:---------|:-----------|:------------------|:------------|:----------------|:--------------|:-----------------|:--------------------|:---------------|:----------------|
| 0 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 9 |  |  |  |  |  | X | X | X | X | X | | X | X | | | | | | X | | | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 34 |  |  |  |  |  | X | | | | X | X | | X | | | | | | | | | X | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 12 |  |  |  |  |  | X | | | | X | X | | X | | | | | | X | | | X | X | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | |
| 4 | 6 |  |  |  |  |  | X | | | X | | X | | X | | | | | | | | | X | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | |
| 5 | 6 |  |  |  |  |  | X | | | | | X | | X | | | | | | X | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
The-F00L/charges_ | ---
license: mit
---
|
Multimodal-Fatima/FGVC_Aircraft_test_facebook_opt_350m_Visclues_ns_3333_random | ---
dataset_info:
features:
- name: id
dtype: int64
- name: image
dtype: image
- name: prompt
dtype: string
- name: true_label
dtype: string
- name: prediction
dtype: string
- name: scores
sequence: float64
splits:
- name: fewshot_1_bs_16
num_bytes: 300686732.375
num_examples: 3333
- name: fewshot_3_bs_16
num_bytes: 302943471.375
num_examples: 3333
download_size: 595742250
dataset_size: 603630203.75
---
# Dataset Card for "FGVC_Aircraft_test_facebook_opt_350m_Visclues_ns_3333_random"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Back-up/test_ds | ---
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: response
struct:
- name: response
dtype: string
- name: answers
struct:
- name: answer_start
sequence: int64
- name: text
sequence: string
- name: instruction
dtype: string
- name: prompt_name
dtype: string
- name: metadata
struct:
- name: max_ratio
dtype: float64
- name: paragraph_similar
dtype: string
- name: start_index
dtype: int64
splits:
- name: train
num_bytes: 21511872
num_examples: 7597
download_size: 0
dataset_size: 21511872
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "test_ds"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
SpongeBash/bash_images_2 | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 146725.0
num_examples: 12
download_size: 148375
dataset_size: 146725.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "bash_images_2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
MicPie/unpredictable_5k | ---
annotations_creators:
- no-annotation
language_creators:
- found
language:
- en
license:
- apache-2.0
multilinguality:
- monolingual
pretty_name: UnpredicTable-5k
size_categories:
- 100K<n<1M
source_datasets: []
task_categories:
- multiple-choice
- question-answering
- zero-shot-classification
- text2text-generation
- table-question-answering
- text-generation
- text-classification
- tabular-classification
task_ids:
- multiple-choice-qa
- extractive-qa
- open-domain-qa
- closed-domain-qa
- closed-book-qa
- open-book-qa
- language-modeling
- multi-class-classification
- natural-language-inference
- topic-classification
- multi-label-classification
- tabular-multi-class-classification
- tabular-multi-label-classification
---
# Dataset Card for "UnpredicTable-5k" - Dataset of Few-shot Tasks from Tables
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-instances)
- [Data Splits](#data-instances)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
## Dataset Description
- **Homepage:** https://ethanperez.net/unpredictable
- **Repository:** https://github.com/JunShern/few-shot-adaptation
- **Paper:** Few-shot Adaptation Works with UnpredicTable Data
- **Point of Contact:** junshern@nyu.edu, perez@nyu.edu
### Dataset Summary
The UnpredicTable dataset consists of web tables formatted as few-shot tasks for fine-tuning language models to improve their few-shot performance.
There are several dataset versions available:
* [UnpredicTable-full](https://huggingface.co/datasets/MicPie/unpredictable_full): Starting from the initial WTC corpus of 50M tables, we apply our tables-to-tasks procedure to produce our resulting dataset, [UnpredicTable-full](https://huggingface.co/datasets/MicPie/unpredictable_full), which comprises 413,299 tasks from 23,744 unique websites.
* [UnpredicTable-unique](https://huggingface.co/datasets/MicPie/unpredictable_unique): This is the same as [UnpredicTable-full](https://huggingface.co/datasets/MicPie/unpredictable_full) but filtered to have a maximum of one task per website. [UnpredicTable-unique](https://huggingface.co/datasets/MicPie/unpredictable_unique) contains exactly 23,744 tasks from 23,744 websites.
* [UnpredicTable-5k](https://huggingface.co/datasets/MicPie/unpredictable_5k): This dataset contains 5k random tables from the full dataset.
* UnpredicTable data subsets based on a manual human quality rating (please see our publication for details of the ratings):
* [UnpredicTable-rated-low](https://huggingface.co/datasets/MicPie/unpredictable_rated-low)
* [UnpredicTable-rated-medium](https://huggingface.co/datasets/MicPie/unpredictable_rated-medium)
* [UnpredicTable-rated-high](https://huggingface.co/datasets/MicPie/unpredictable_rated-high)
* UnpredicTable data subsets based on the website of origin:
* [UnpredicTable-baseball-fantasysports-yahoo-com](https://huggingface.co/datasets/MicPie/unpredictable_baseball-fantasysports-yahoo-com)
* [UnpredicTable-bulbapedia-bulbagarden-net](https://huggingface.co/datasets/MicPie/unpredictable_bulbapedia-bulbagarden-net)
* [UnpredicTable-cappex-com](https://huggingface.co/datasets/MicPie/unpredictable_cappex-com)
* [UnpredicTable-cram-com](https://huggingface.co/datasets/MicPie/unpredictable_cram-com)
* [UnpredicTable-dividend-com](https://huggingface.co/datasets/MicPie/unpredictable_dividend-com)
* [UnpredicTable-dummies-com](https://huggingface.co/datasets/MicPie/unpredictable_dummies-com)
* [UnpredicTable-en-wikipedia-org](https://huggingface.co/datasets/MicPie/unpredictable_en-wikipedia-org)
* [UnpredicTable-ensembl-org](https://huggingface.co/datasets/MicPie/unpredictable_ensembl-org)
* [UnpredicTable-gamefaqs-com](https://huggingface.co/datasets/MicPie/unpredictable_gamefaqs-com)
* [UnpredicTable-mgoblog-com](https://huggingface.co/datasets/MicPie/unpredictable_mgoblog-com)
* [UnpredicTable-mmo-champion-com](https://huggingface.co/datasets/MicPie/unpredictable_mmo-champion-com)
* [UnpredicTable-msdn-microsoft-com](https://huggingface.co/datasets/MicPie/unpredictable_msdn-microsoft-com)
* [UnpredicTable-phonearena-com](https://huggingface.co/datasets/MicPie/unpredictable_phonearena-com)
* [UnpredicTable-sittercity-com](https://huggingface.co/datasets/MicPie/unpredictable_sittercity-com)
* [UnpredicTable-sporcle-com](https://huggingface.co/datasets/MicPie/unpredictable_sporcle-com)
* [UnpredicTable-studystack-com](https://huggingface.co/datasets/MicPie/unpredictable_studystack-com)
* [UnpredicTable-support-google-com](https://huggingface.co/datasets/MicPie/unpredictable_support-google-com)
* [UnpredicTable-w3-org](https://huggingface.co/datasets/MicPie/unpredictable_w3-org)
* [UnpredicTable-wiki-openmoko-org](https://huggingface.co/datasets/MicPie/unpredictable_wiki-openmoko-org)
* [UnpredicTable-wkdu-org](https://huggingface.co/datasets/MicPie/unpredictable_wkdu-org)
* UnpredicTable data subsets based on clustering (for the clustering details please see our publication):
* [UnpredicTable-cluster00](https://huggingface.co/datasets/MicPie/unpredictable_cluster00)
* [UnpredicTable-cluster01](https://huggingface.co/datasets/MicPie/unpredictable_cluster01)
* [UnpredicTable-cluster02](https://huggingface.co/datasets/MicPie/unpredictable_cluster02)
* [UnpredicTable-cluster03](https://huggingface.co/datasets/MicPie/unpredictable_cluster03)
* [UnpredicTable-cluster04](https://huggingface.co/datasets/MicPie/unpredictable_cluster04)
* [UnpredicTable-cluster05](https://huggingface.co/datasets/MicPie/unpredictable_cluster05)
* [UnpredicTable-cluster06](https://huggingface.co/datasets/MicPie/unpredictable_cluster06)
* [UnpredicTable-cluster07](https://huggingface.co/datasets/MicPie/unpredictable_cluster07)
* [UnpredicTable-cluster08](https://huggingface.co/datasets/MicPie/unpredictable_cluster08)
* [UnpredicTable-cluster09](https://huggingface.co/datasets/MicPie/unpredictable_cluster09)
* [UnpredicTable-cluster10](https://huggingface.co/datasets/MicPie/unpredictable_cluster10)
* [UnpredicTable-cluster11](https://huggingface.co/datasets/MicPie/unpredictable_cluster11)
* [UnpredicTable-cluster12](https://huggingface.co/datasets/MicPie/unpredictable_cluster12)
* [UnpredicTable-cluster13](https://huggingface.co/datasets/MicPie/unpredictable_cluster13)
* [UnpredicTable-cluster14](https://huggingface.co/datasets/MicPie/unpredictable_cluster14)
* [UnpredicTable-cluster15](https://huggingface.co/datasets/MicPie/unpredictable_cluster15)
* [UnpredicTable-cluster16](https://huggingface.co/datasets/MicPie/unpredictable_cluster16)
* [UnpredicTable-cluster17](https://huggingface.co/datasets/MicPie/unpredictable_cluster17)
* [UnpredicTable-cluster18](https://huggingface.co/datasets/MicPie/unpredictable_cluster18)
* [UnpredicTable-cluster19](https://huggingface.co/datasets/MicPie/unpredictable_cluster19)
* [UnpredicTable-cluster20](https://huggingface.co/datasets/MicPie/unpredictable_cluster20)
* [UnpredicTable-cluster21](https://huggingface.co/datasets/MicPie/unpredictable_cluster21)
* [UnpredicTable-cluster22](https://huggingface.co/datasets/MicPie/unpredictable_cluster22)
* [UnpredicTable-cluster23](https://huggingface.co/datasets/MicPie/unpredictable_cluster23)
* [UnpredicTable-cluster24](https://huggingface.co/datasets/MicPie/unpredictable_cluster24)
* [UnpredicTable-cluster25](https://huggingface.co/datasets/MicPie/unpredictable_cluster25)
* [UnpredicTable-cluster26](https://huggingface.co/datasets/MicPie/unpredictable_cluster26)
* [UnpredicTable-cluster27](https://huggingface.co/datasets/MicPie/unpredictable_cluster27)
* [UnpredicTable-cluster28](https://huggingface.co/datasets/MicPie/unpredictable_cluster28)
* [UnpredicTable-cluster29](https://huggingface.co/datasets/MicPie/unpredictable_cluster29)
* [UnpredicTable-cluster-noise](https://huggingface.co/datasets/MicPie/unpredictable_cluster-noise)
### Supported Tasks and Leaderboards
Since the tables come from the web, the distribution of tasks and topics is very broad. The shape of our dataset is very wide, i.e., we have thousands of tasks, while each task has only a few examples, in contrast to most current NLP datasets, which are very deep, i.e., tens of tasks with many examples each. This implies that our dataset covers a broad range of potential tasks, e.g., multiple-choice, question-answering, table-question-answering, text-classification, etc.
The intended use of this dataset is to improve few-shot performance by fine-tuning/pre-training on our dataset.
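Any of the subsets listed above can be loaded directly with the `datasets` library; the snippet below is a minimal sketch, assuming the usual single "train" split and using the `cluster00` subset as a stand-in for any other subset name:
```python
from datasets import load_dataset

# Any subset listed above can be used here, e.g. "MicPie/unpredictable_rated-high".
# We assume the usual single "train" split (no additional splits are provided).
dataset = load_dataset("MicPie/unpredictable_cluster00", split="train")

print(dataset[0])  # one few-shot example with 'task', 'input', 'options', 'output', and metadata fields
```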
### Languages
English
## Dataset Structure
### Data Instances
Each task is represented as a JSON Lines file and consists of several few-shot examples. Each example is a dictionary containing a field 'task', which identifies the task, followed by an 'input', 'options', and 'output' field. The 'input' field contains several column elements of the same row in the table, while the 'output' field is a target which represents an individual column of the same row. Each task contains several such examples, which can be concatenated as a few-shot task. In the case of multiple-choice classification, the 'options' field contains the possible classes that a model needs to choose from.
There are also additional metadata fields, such as 'pageTitle', 'title', 'outputColName', 'url', and 'wdcFile'.
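For illustration, a single example could look roughly like the dictionary below; all values, including the way the row is serialized into 'input', are made up and only show the shape of the fields described above:
```python
{
    "task": "example_quiz_table__answer",  # hypothetical task identifier
    "input": "question: What is the capital of France? | category: geography",
    "options": ["Paris", "London", "Berlin", "Madrid"],
    "output": "Paris",
    "pageTitle": "Example quiz page",
    "title": "Example quiz table",
    "outputColName": "answer",
    "url": "https://example.com/quiz",
    "wdcFile": "...",
}
```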
### Data Fields
* 'task': task identifier.
* 'input': column elements of a specific row in the table.
* 'options': for multiple-choice classification, the options to choose from.
* 'output': target column element of the same row as the input.
* 'pageTitle': the title of the page containing the table.
* 'outputColName': the name of the output column.
* 'url': the URL of the website containing the table.
* 'wdcFile': the WDC Web Table Corpus file.
### Data Splits
The UnpredicTable datasets do not come with additional data splits.
## Dataset Creation
### Curation Rationale
Few-shot training on multi-task datasets has been demonstrated to improve language models' few-shot learning (FSL) performance on new tasks, but it is unclear which training tasks lead to effective downstream task adaptation. Few-shot learning datasets are typically produced with expensive human curation, limiting the scale and diversity of the training tasks available to study. As an alternative source of few-shot data, we automatically extract 413,299 tasks from diverse internet tables. We provide this as a research resource to investigate the relationship between training data and few-shot learning.
### Source Data
#### Initial Data Collection and Normalization
We use internet tables from the English-language Relational Subset of the WDC Web Table Corpus 2015 (WTC). The WTC dataset tables were extracted from the July 2015 Common Crawl web corpus (http://webdatacommons.org/webtables/2015/EnglishStatistics.html). The dataset contains 50,820,165 tables from 323,160 web domains. We then convert the tables into few-shot learning tasks. Please see our publication for more details on the data collection and conversion pipeline.
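To make the conversion concrete, the toy sketch below shows how one row of a web table can be turned into an example in the format described under Data Instances; it is only an illustration of the general idea, not the actual conversion pipeline:
```python
def row_to_example(task_id, header, row, output_col):
    """Toy conversion of one table row into a few-shot example.

    `header` holds the column names, `row` the corresponding cell values, and
    `output_col` is the column chosen as the prediction target. This is a
    simplified illustration, not the authors' actual conversion pipeline.
    """
    target_idx = header.index(output_col)
    input_parts = [
        f"{col}: {val}"
        for i, (col, val) in enumerate(zip(header, row))
        if i != target_idx
    ]
    return {
        "task": task_id,
        "input": " | ".join(input_parts),  # remaining columns of the row
        "output": row[target_idx],         # the target column of the same row
    }

example = row_to_example(
    task_id="toy_table__capital",
    header=["country", "continent", "capital"],
    row=["France", "Europe", "Paris"],
    output_col="capital",
)
```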
#### Who are the source language producers?
The dataset is extracted from [WDC Web Table Corpora](http://webdatacommons.org/webtables/).
### Annotations
#### Annotation process
Manual annotation was only carried out for the [UnpredicTable-rated-low](https://huggingface.co/datasets/MicPie/unpredictable_rated-low),
[UnpredicTable-rated-medium](https://huggingface.co/datasets/MicPie/unpredictable_rated-medium), and [UnpredicTable-rated-high](https://huggingface.co/datasets/MicPie/unpredictable_rated-high) data subsets to rate task quality. Detailed annotation instructions can be found in our publication.
#### Who are the annotators?
Annotations were carried out by a lab assistant.
### Personal and Sensitive Information
The data was extracted from [WDC Web Table Corpora](http://webdatacommons.org/webtables/), which in turn extracted tables from the [Common Crawl](https://commoncrawl.org/). We did not filter the data in any way. Thus any user identities or otherwise sensitive information (e.g., data that reveals racial or ethnic origins, sexual orientations, religious beliefs, political opinions or union memberships, or locations; financial or health data; biometric or genetic data; forms of government identification, such as social security numbers; criminal history, etc.) might be contained in our dataset.
## Considerations for Using the Data
### Social Impact of Dataset
This dataset is intended for use as a research resource to investigate the relationship between training data and few-shot learning. As such, it contains high- and low-quality data, as well as diverse content that may be untruthful or inappropriate. Without careful investigation, it should not be used for training models that will be deployed for use in decision-critical or user-facing situations.
### Discussion of Biases
Since our dataset contains tables that are scraped from the web, it will also contain many toxic, racist, sexist, and otherwise harmful biases and texts. We have not run any analysis on the biases prevalent in our datasets, nor have we explicitly filtered the content. This implies that a model trained on our dataset may reflect harmful biases and toxic text that exist in our dataset.
### Other Known Limitations
No additional known limitations.
## Additional Information
### Dataset Curators
Jun Shern Chan, Michael Pieler, Jonathan Jao, Jérémy Scheurer, Ethan Perez
### Licensing Information
Apache 2.0
### Citation Information
```
@misc{chan2022few,
author = {Chan, Jun Shern and Pieler, Michael and Jao, Jonathan and Scheurer, Jérémy and Perez, Ethan},
title = {Few-shot Adaptation Works with UnpredicTable Data},
publisher = {arXiv},
year = {2022},
url = {https://arxiv.org/abs/2208.01009}
}
```
|
HossainRabby/DATA | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 379777.2888616891
num_examples: 735
- name: test
num_bytes: 42369.71113831089
num_examples: 82
download_size: 165978
dataset_size: 422147.0
---
# Dataset Card for "DATA"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
cbasconc/instructions_Device | ---
language:
- es
pretty_name: devices_clasification
--- |
AhmadMustafa/Urdu-Instruct-News-Headline-Generation | ---
language:
- ur
size_categories:
- 100K<n<1M
task_categories:
- text-generation
- summarization
pretty_name: Urdu Instruct News Headline Generation
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: template_id
dtype: int64
- name: template_lang
sequence: string
splits:
- name: train
num_bytes: 255002720
num_examples: 100674
- name: test
num_bytes: 28284699
num_examples: 11187
download_size: 121546528
dataset_size: 283287419
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
# Dataset Card for "Urdu-Instruct-News-Headline-Generation"
This dataset is converted from the [original dataset](https://data.mendeley.com/datasets/834vsxnb99/3) by Khalid Hussain, Nimra Mughal, Irfan Ali, Saif Hassan, and Sher Muhammad Daudpota.
## Task:
Generate the news headline from the given news article.
## Split Size:
- train: 100674
- test: 11187
## Prompt Template (In Urdu):
One of the following two templates is selected at random for each example (e.g., with `random.choice`). The first template is **template_id** 1, and the second template is **template_id** 2 in the dataset.
```
["اس اردو پیراگراف (خبروں) کا عنوان تجویز کریں
پیراگراف: {}",
"دیے گے خبروں کا عنوان تجویز کریں.
جملے: {}"
]
```
<b>Translation</b>:
```
1. Write a title for the following news article:
paragraph: {}
2. Suggest the title of the given sentences
sentences: {}
```
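Building the prompt for an example therefore amounts to picking one of the two templates at random and filling in the news text; a minimal sketch (the function name and return format are illustrative, not taken from the conversion script):
```python
import random

# The two prompt templates shown above (template_id 1 and 2).
PROMPT_TEMPLATES = {
    1: "اس اردو پیراگراف (خبروں) کا عنوان تجویز کریں\nپیراگراف: {}",
    2: "دیے گے خبروں کا عنوان تجویز کریں.\nجملے: {}",
}

def build_prompt(news_text):
    """Pick one template at random and fill in the news text."""
    template_id = random.choice([1, 2])
    return template_id, PROMPT_TEMPLATES[template_id].format(news_text)
```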
## Completion Template (In Urdu)
```
جی ضرور، یہ رہا آپ کے پیراگراف کا عنوان:
{}
```
<b>Translation</b>:
```
Sure, here is the title of the given article
{}
``` |
DarqueDante/minihercules | ---
dataset_info:
features:
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: source
dtype: string
splits:
- name: train
num_bytes: 3482636670
num_examples: 1637895
download_size: 1792443326
dataset_size: 3482636670
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
sidmanale643/medGENIUS | ---
task_categories:
- question-answering
language:
- en
tags:
- medical
size_categories:
- n<1K
--- |
allennghayoui/mistral-chat-code-assistant-new-prompt | ---
dataset_info:
features:
- name: prompt
dtype: string
splits:
- name: train
num_bytes: 277684.1458333333
num_examples: 172
- name: test
num_bytes: 16144.427083333334
num_examples: 10
- name: validation
num_bytes: 16144.427083333334
num_examples: 10
download_size: 98727
dataset_size: 309972.99999999994
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
|
open-llm-leaderboard/details_SanjiWatsuki__openchat-3.5-1210-starling-slerp | ---
pretty_name: Evaluation run of SanjiWatsuki/openchat-3.5-1210-starling-slerp
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [SanjiWatsuki/openchat-3.5-1210-starling-slerp](https://huggingface.co/SanjiWatsuki/openchat-3.5-1210-starling-slerp)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_SanjiWatsuki__openchat-3.5-1210-starling-slerp\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-27T12:59:24.501037](https://huggingface.co/datasets/open-llm-leaderboard/details_SanjiWatsuki__openchat-3.5-1210-starling-slerp/blob/main/results_2023-12-27T12-59-24.501037.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6530257167647132,\n\
\ \"acc_stderr\": 0.031906035120016406,\n \"acc_norm\": 0.6537129491623167,\n\
\ \"acc_norm_stderr\": 0.03255707134476186,\n \"mc1\": 0.3378212974296206,\n\
\ \"mc1_stderr\": 0.01655716732251688,\n \"mc2\": 0.4992288323014176,\n\
\ \"mc2_stderr\": 0.015334932030447291\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6075085324232082,\n \"acc_stderr\": 0.014269634635670728,\n\
\ \"acc_norm\": 0.6390784982935154,\n \"acc_norm_stderr\": 0.014034761386175452\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6713802031467835,\n\
\ \"acc_stderr\": 0.004687514708345319,\n \"acc_norm\": 0.8527185819557856,\n\
\ \"acc_norm_stderr\": 0.003536619673019997\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6710526315789473,\n \"acc_stderr\": 0.03823428969926605,\n\
\ \"acc_norm\": 0.6710526315789473,\n \"acc_norm_stderr\": 0.03823428969926605\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n\
\ \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \
\ \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.690566037735849,\n \"acc_stderr\": 0.028450154794118637,\n\
\ \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.028450154794118637\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7847222222222222,\n\
\ \"acc_stderr\": 0.03437079344106135,\n \"acc_norm\": 0.7847222222222222,\n\
\ \"acc_norm_stderr\": 0.03437079344106135\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n\
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6763005780346821,\n\
\ \"acc_stderr\": 0.0356760379963917,\n \"acc_norm\": 0.6763005780346821,\n\
\ \"acc_norm_stderr\": 0.0356760379963917\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.45098039215686275,\n \"acc_stderr\": 0.049512182523962625,\n\
\ \"acc_norm\": 0.45098039215686275,\n \"acc_norm_stderr\": 0.049512182523962625\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5872340425531914,\n \"acc_stderr\": 0.03218471141400352,\n\
\ \"acc_norm\": 0.5872340425531914,\n \"acc_norm_stderr\": 0.03218471141400352\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n\
\ \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3968253968253968,\n \"acc_stderr\": 0.025197101074246483,\n \"\
acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.025197101074246483\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5158730158730159,\n\
\ \"acc_stderr\": 0.044698818540726076,\n \"acc_norm\": 0.5158730158730159,\n\
\ \"acc_norm_stderr\": 0.044698818540726076\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909282,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909282\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7935483870967742,\n\
\ \"acc_stderr\": 0.023025899617188716,\n \"acc_norm\": 0.7935483870967742,\n\
\ \"acc_norm_stderr\": 0.023025899617188716\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.74,\n \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\"\
: 0.74,\n \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.793939393939394,\n \"acc_stderr\": 0.0315841532404771,\n\
\ \"acc_norm\": 0.793939393939394,\n \"acc_norm_stderr\": 0.0315841532404771\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267042,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267042\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.02150024957603348,\n\
\ \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.02150024957603348\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6897435897435897,\n \"acc_stderr\": 0.02345467488940429,\n \
\ \"acc_norm\": 0.6897435897435897,\n \"acc_norm_stderr\": 0.02345467488940429\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.02874204090394848,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.02874204090394848\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.030066761582977934,\n\
\ \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.030066761582977934\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8513761467889909,\n \"acc_stderr\": 0.015251253773660836,\n \"\
acc_norm\": 0.8513761467889909,\n \"acc_norm_stderr\": 0.015251253773660836\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5509259259259259,\n \"acc_stderr\": 0.033922384053216174,\n \"\
acc_norm\": 0.5509259259259259,\n \"acc_norm_stderr\": 0.033922384053216174\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8431372549019608,\n \"acc_stderr\": 0.025524722324553353,\n \"\
acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.025524722324553353\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8227848101265823,\n \"acc_stderr\": 0.024856364184503234,\n \
\ \"acc_norm\": 0.8227848101265823,\n \"acc_norm_stderr\": 0.024856364184503234\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7219730941704036,\n\
\ \"acc_stderr\": 0.030069584874494036,\n \"acc_norm\": 0.7219730941704036,\n\
\ \"acc_norm_stderr\": 0.030069584874494036\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596914,\n\
\ \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596914\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.03749492448709695,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.03749492448709695\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.031921934489347235,\n\
\ \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.031921934489347235\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.03760178006026621,\n\
\ \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.03760178006026621\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8931623931623932,\n\
\ \"acc_stderr\": 0.020237149008990925,\n \"acc_norm\": 0.8931623931623932,\n\
\ \"acc_norm_stderr\": 0.020237149008990925\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8250319284802043,\n\
\ \"acc_stderr\": 0.013586619219903338,\n \"acc_norm\": 0.8250319284802043,\n\
\ \"acc_norm_stderr\": 0.013586619219903338\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7369942196531792,\n \"acc_stderr\": 0.023703099525258176,\n\
\ \"acc_norm\": 0.7369942196531792,\n \"acc_norm_stderr\": 0.023703099525258176\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.36983240223463687,\n\
\ \"acc_stderr\": 0.016145881256056215,\n \"acc_norm\": 0.36983240223463687,\n\
\ \"acc_norm_stderr\": 0.016145881256056215\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.02545775669666788,\n\
\ \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.02545775669666788\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n\
\ \"acc_stderr\": 0.02567025924218893,\n \"acc_norm\": 0.7138263665594855,\n\
\ \"acc_norm_stderr\": 0.02567025924218893\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7314814814814815,\n \"acc_stderr\": 0.02465968518596728,\n\
\ \"acc_norm\": 0.7314814814814815,\n \"acc_norm_stderr\": 0.02465968518596728\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48936170212765956,\n \"acc_stderr\": 0.029820747191422473,\n \
\ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.029820747191422473\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4830508474576271,\n\
\ \"acc_stderr\": 0.012762896889210855,\n \"acc_norm\": 0.4830508474576271,\n\
\ \"acc_norm_stderr\": 0.012762896889210855\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.027678468642144714,\n\
\ \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.027678468642144714\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6748366013071896,\n \"acc_stderr\": 0.01895088677080631,\n \
\ \"acc_norm\": 0.6748366013071896,\n \"acc_norm_stderr\": 0.01895088677080631\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7061224489795919,\n \"acc_stderr\": 0.02916273841024977,\n\
\ \"acc_norm\": 0.7061224489795919,\n \"acc_norm_stderr\": 0.02916273841024977\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n\
\ \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n\
\ \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n\
\ \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n\
\ \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3378212974296206,\n\
\ \"mc1_stderr\": 0.01655716732251688,\n \"mc2\": 0.4992288323014176,\n\
\ \"mc2_stderr\": 0.015334932030447291\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8082083662194159,\n \"acc_stderr\": 0.011065209664659527\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6702047005307051,\n \
\ \"acc_stderr\": 0.012949955030571149\n }\n}\n```"
repo_url: https://huggingface.co/SanjiWatsuki/openchat-3.5-1210-starling-slerp
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_27T12_59_24.501037
path:
- '**/details_harness|arc:challenge|25_2023-12-27T12-59-24.501037.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-27T12-59-24.501037.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_27T12_59_24.501037
path:
- '**/details_harness|gsm8k|5_2023-12-27T12-59-24.501037.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-27T12-59-24.501037.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_27T12_59_24.501037
path:
- '**/details_harness|hellaswag|10_2023-12-27T12-59-24.501037.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-27T12-59-24.501037.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_27T12_59_24.501037
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-27T12-59-24.501037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-27T12-59-24.501037.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-27T12-59-24.501037.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_27T12_59_24.501037
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-27T12-59-24.501037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-27T12-59-24.501037.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_27T12_59_24.501037
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-27T12-59-24.501037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-27T12-59-24.501037.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_27T12_59_24.501037
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-27T12-59-24.501037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-27T12-59-24.501037.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_27T12_59_24.501037
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-27T12-59-24.501037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-27T12-59-24.501037.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_27T12_59_24.501037
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-27T12-59-24.501037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-27T12-59-24.501037.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_27T12_59_24.501037
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-27T12-59-24.501037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-27T12-59-24.501037.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_27T12_59_24.501037
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-27T12-59-24.501037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-27T12-59-24.501037.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_27T12_59_24.501037
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-27T12-59-24.501037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-27T12-59-24.501037.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_27T12_59_24.501037
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-27T12-59-24.501037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-27T12-59-24.501037.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_27T12_59_24.501037
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-27T12-59-24.501037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-27T12-59-24.501037.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_27T12_59_24.501037
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-27T12-59-24.501037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-27T12-59-24.501037.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_27T12_59_24.501037
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-27T12-59-24.501037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-27T12-59-24.501037.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_27T12_59_24.501037
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-27T12-59-24.501037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-27T12-59-24.501037.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_27T12_59_24.501037
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-27T12-59-24.501037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-27T12-59-24.501037.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_27T12_59_24.501037
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-27T12-59-24.501037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-27T12-59-24.501037.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_27T12_59_24.501037
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-27T12-59-24.501037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-27T12-59-24.501037.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_27T12_59_24.501037
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-27T12-59-24.501037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-27T12-59-24.501037.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_27T12_59_24.501037
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-27T12-59-24.501037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-27T12-59-24.501037.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_27T12_59_24.501037
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-27T12-59-24.501037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-27T12-59-24.501037.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_27T12_59_24.501037
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-27T12-59-24.501037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-27T12-59-24.501037.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_27T12_59_24.501037
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-27T12-59-24.501037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-27T12-59-24.501037.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_27T12_59_24.501037
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-27T12-59-24.501037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-27T12-59-24.501037.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_27T12_59_24.501037
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-27T12-59-24.501037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-27T12-59-24.501037.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_27T12_59_24.501037
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-27T12-59-24.501037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-27T12-59-24.501037.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_27T12_59_24.501037
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-27T12-59-24.501037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-27T12-59-24.501037.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_27T12_59_24.501037
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-27T12-59-24.501037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-27T12-59-24.501037.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_27T12_59_24.501037
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-27T12-59-24.501037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-27T12-59-24.501037.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_27T12_59_24.501037
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-27T12-59-24.501037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-27T12-59-24.501037.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_27T12_59_24.501037
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-27T12-59-24.501037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-27T12-59-24.501037.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_27T12_59_24.501037
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-27T12-59-24.501037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-27T12-59-24.501037.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_27T12_59_24.501037
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-27T12-59-24.501037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-27T12-59-24.501037.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_27T12_59_24.501037
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-27T12-59-24.501037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-27T12-59-24.501037.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_27T12_59_24.501037
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-27T12-59-24.501037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-27T12-59-24.501037.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_27T12_59_24.501037
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-27T12-59-24.501037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-27T12-59-24.501037.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_27T12_59_24.501037
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-27T12-59-24.501037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-27T12-59-24.501037.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_27T12_59_24.501037
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-27T12-59-24.501037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-27T12-59-24.501037.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_27T12_59_24.501037
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-27T12-59-24.501037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-27T12-59-24.501037.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_27T12_59_24.501037
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-27T12-59-24.501037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-27T12-59-24.501037.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_27T12_59_24.501037
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-27T12-59-24.501037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-27T12-59-24.501037.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_27T12_59_24.501037
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-27T12-59-24.501037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-27T12-59-24.501037.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_27T12_59_24.501037
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-27T12-59-24.501037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-27T12-59-24.501037.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_27T12_59_24.501037
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-27T12-59-24.501037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-27T12-59-24.501037.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_27T12_59_24.501037
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-27T12-59-24.501037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-27T12-59-24.501037.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_27T12_59_24.501037
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-27T12-59-24.501037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-27T12-59-24.501037.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_27T12_59_24.501037
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-27T12-59-24.501037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-27T12-59-24.501037.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_27T12_59_24.501037
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-27T12-59-24.501037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-27T12-59-24.501037.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_27T12_59_24.501037
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-27T12-59-24.501037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-27T12-59-24.501037.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_27T12_59_24.501037
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-27T12-59-24.501037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-27T12-59-24.501037.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_27T12_59_24.501037
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-27T12-59-24.501037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-27T12-59-24.501037.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_27T12_59_24.501037
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-27T12-59-24.501037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-27T12-59-24.501037.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_27T12_59_24.501037
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-27T12-59-24.501037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-27T12-59-24.501037.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_27T12_59_24.501037
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-27T12-59-24.501037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-27T12-59-24.501037.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_27T12_59_24.501037
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-27T12-59-24.501037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-27T12-59-24.501037.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_27T12_59_24.501037
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-27T12-59-24.501037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-27T12-59-24.501037.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_27T12_59_24.501037
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-27T12-59-24.501037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-27T12-59-24.501037.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_27T12_59_24.501037
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-27T12-59-24.501037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-27T12-59-24.501037.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_27T12_59_24.501037
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-27T12-59-24.501037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-27T12-59-24.501037.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_27T12_59_24.501037
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-27T12-59-24.501037.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-27T12-59-24.501037.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_27T12_59_24.501037
path:
- '**/details_harness|winogrande|5_2023-12-27T12-59-24.501037.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-27T12-59-24.501037.parquet'
- config_name: results
data_files:
- split: 2023_12_27T12_59_24.501037
path:
- results_2023-12-27T12-59-24.501037.parquet
- split: latest
path:
- results_2023-12-27T12-59-24.501037.parquet
---
# Dataset Card for Evaluation run of SanjiWatsuki/openchat-3.5-1210-starling-slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [SanjiWatsuki/openchat-3.5-1210-starling-slerp](https://huggingface.co/SanjiWatsuki/openchat-3.5-1210-starling-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_SanjiWatsuki__openchat-3.5-1210-starling-slerp",
"harness_winogrande_5",
split="train")
```
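Similarly, the aggregated results mentioned above can be loaded from the "results" configuration, here using the "latest" split (a minimal sketch):
```python
from datasets import load_dataset

# Load the aggregated metrics for the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_SanjiWatsuki__openchat-3.5-1210-starling-slerp",
    "results",
    split="latest",
)
```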
## Latest results
These are the [latest results from run 2023-12-27T12:59:24.501037](https://huggingface.co/datasets/open-llm-leaderboard/details_SanjiWatsuki__openchat-3.5-1210-starling-slerp/blob/main/results_2023-12-27T12-59-24.501037.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6530257167647132,
"acc_stderr": 0.031906035120016406,
"acc_norm": 0.6537129491623167,
"acc_norm_stderr": 0.03255707134476186,
"mc1": 0.3378212974296206,
"mc1_stderr": 0.01655716732251688,
"mc2": 0.4992288323014176,
"mc2_stderr": 0.015334932030447291
},
"harness|arc:challenge|25": {
"acc": 0.6075085324232082,
"acc_stderr": 0.014269634635670728,
"acc_norm": 0.6390784982935154,
"acc_norm_stderr": 0.014034761386175452
},
"harness|hellaswag|10": {
"acc": 0.6713802031467835,
"acc_stderr": 0.004687514708345319,
"acc_norm": 0.8527185819557856,
"acc_norm_stderr": 0.003536619673019997
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6710526315789473,
"acc_stderr": 0.03823428969926605,
"acc_norm": 0.6710526315789473,
"acc_norm_stderr": 0.03823428969926605
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.690566037735849,
"acc_stderr": 0.028450154794118637,
"acc_norm": 0.690566037735849,
"acc_norm_stderr": 0.028450154794118637
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7847222222222222,
"acc_stderr": 0.03437079344106135,
"acc_norm": 0.7847222222222222,
"acc_norm_stderr": 0.03437079344106135
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.0356760379963917,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.0356760379963917
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.45098039215686275,
"acc_stderr": 0.049512182523962625,
"acc_norm": 0.45098039215686275,
"acc_norm_stderr": 0.049512182523962625
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5872340425531914,
"acc_stderr": 0.03218471141400352,
"acc_norm": 0.5872340425531914,
"acc_norm_stderr": 0.03218471141400352
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.025197101074246483,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.025197101074246483
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5158730158730159,
"acc_stderr": 0.044698818540726076,
"acc_norm": 0.5158730158730159,
"acc_norm_stderr": 0.044698818540726076
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7935483870967742,
"acc_stderr": 0.023025899617188716,
"acc_norm": 0.7935483870967742,
"acc_norm_stderr": 0.023025899617188716
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.74,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.74,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.793939393939394,
"acc_stderr": 0.0315841532404771,
"acc_norm": 0.793939393939394,
"acc_norm_stderr": 0.0315841532404771
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267042,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267042
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.02150024957603348,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.02150024957603348
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6897435897435897,
"acc_stderr": 0.02345467488940429,
"acc_norm": 0.6897435897435897,
"acc_norm_stderr": 0.02345467488940429
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.02874204090394848,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.02874204090394848
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6890756302521008,
"acc_stderr": 0.030066761582977934,
"acc_norm": 0.6890756302521008,
"acc_norm_stderr": 0.030066761582977934
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8513761467889909,
"acc_stderr": 0.015251253773660836,
"acc_norm": 0.8513761467889909,
"acc_norm_stderr": 0.015251253773660836
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5509259259259259,
"acc_stderr": 0.033922384053216174,
"acc_norm": 0.5509259259259259,
"acc_norm_stderr": 0.033922384053216174
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.025524722324553353,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.025524722324553353
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8227848101265823,
"acc_stderr": 0.024856364184503234,
"acc_norm": 0.8227848101265823,
"acc_norm_stderr": 0.024856364184503234
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7219730941704036,
"acc_stderr": 0.030069584874494036,
"acc_norm": 0.7219730941704036,
"acc_norm_stderr": 0.030069584874494036
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596914,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596914
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.03749492448709695,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.03749492448709695
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7914110429447853,
"acc_stderr": 0.031921934489347235,
"acc_norm": 0.7914110429447853,
"acc_norm_stderr": 0.031921934489347235
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.03760178006026621,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.03760178006026621
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8931623931623932,
"acc_stderr": 0.020237149008990925,
"acc_norm": 0.8931623931623932,
"acc_norm_stderr": 0.020237149008990925
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8250319284802043,
"acc_stderr": 0.013586619219903338,
"acc_norm": 0.8250319284802043,
"acc_norm_stderr": 0.013586619219903338
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7369942196531792,
"acc_stderr": 0.023703099525258176,
"acc_norm": 0.7369942196531792,
"acc_norm_stderr": 0.023703099525258176
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.36983240223463687,
"acc_stderr": 0.016145881256056215,
"acc_norm": 0.36983240223463687,
"acc_norm_stderr": 0.016145881256056215
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.02545775669666788,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.02545775669666788
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.02567025924218893,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.02567025924218893
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.02465968518596728,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.02465968518596728
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.029820747191422473,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.029820747191422473
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4830508474576271,
"acc_stderr": 0.012762896889210855,
"acc_norm": 0.4830508474576271,
"acc_norm_stderr": 0.012762896889210855
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.027678468642144714,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.027678468642144714
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6748366013071896,
"acc_stderr": 0.01895088677080631,
"acc_norm": 0.6748366013071896,
"acc_norm_stderr": 0.01895088677080631
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7061224489795919,
"acc_stderr": 0.02916273841024977,
"acc_norm": 0.7061224489795919,
"acc_norm_stderr": 0.02916273841024977
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3378212974296206,
"mc1_stderr": 0.01655716732251688,
"mc2": 0.4992288323014176,
"mc2_stderr": 0.015334932030447291
},
"harness|winogrande|5": {
"acc": 0.8082083662194159,
"acc_stderr": 0.011065209664659527
},
"harness|gsm8k|5": {
"acc": 0.6702047005307051,
"acc_stderr": 0.012949955030571149
}
}
```
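The aggregated metrics above are also stored in the `results` configuration of this dataset. A minimal sketch for loading its `latest` split (using the config and split names declared in this card's YAML header) could look like the following:
```python
from datasets import load_dataset

# Load the aggregated results; "results" and "latest" are the config and split
# names declared in this dataset card's YAML header.
results = load_dataset(
    "open-llm-leaderboard/details_SanjiWatsuki__openchat-3.5-1210-starling-slerp",
    "results",
    split="latest",
)

# The split holds the aggregated metrics of the latest run.
print(results[0])
```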
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
sdiazlor/evol-test-3.5 | ---
size_categories: n<1K
tags:
- rlfh
- argilla
- human-feedback
---
# Dataset Card for evol-test-3.5
This dataset has been created with [Argilla](https://docs.argilla.io).
As shown in the sections below, this dataset can be loaded into Argilla as explained in [Load with Argilla](#load-with-argilla), or used directly with the `datasets` library in [Load with `datasets`](#load-with-datasets).
## Dataset Description
- **Homepage:** https://argilla.io
- **Repository:** https://github.com/argilla-io/argilla
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset contains:
* A dataset configuration file conforming to the Argilla dataset format named `argilla.yaml`. This configuration file will be used to configure the dataset when using the `FeedbackDataset.from_huggingface` method in Argilla.
* Dataset records in a format compatible with HuggingFace `datasets`. These records will be loaded automatically when using `FeedbackDataset.from_huggingface` and can be loaded independently using the `datasets` library via `load_dataset`.
* The [annotation guidelines](#annotation-guidelines) that have been used for building and curating the dataset, if they've been defined in Argilla.
### Load with Argilla
To load with Argilla, you'll just need to install Argilla as `pip install argilla --upgrade` and then use the following code:
```python
import argilla as rg
ds = rg.FeedbackDataset.from_huggingface("sdiazlor/evol-test-3.5")
```
### Load with `datasets`
To load this dataset with `datasets`, you'll just need to install `datasets` as `pip install datasets --upgrade` and then use the following code:
```python
from datasets import load_dataset
ds = load_dataset("sdiazlor/evol-test-3.5")
```
### Supported Tasks and Leaderboards
This dataset can contain [multiple fields, questions and responses](https://docs.argilla.io/en/latest/conceptual_guides/data_model.html#feedback-dataset) so it can be used for different NLP tasks, depending on the configuration. The dataset structure is described in the [Dataset Structure section](#dataset-structure).
There are no leaderboards associated with this dataset.
### Languages
[More Information Needed]
## Dataset Structure
### Data in Argilla
The dataset is created in Argilla with: **fields**, **questions**, **suggestions**, **metadata**, **vectors**, and **guidelines**.
The **fields** are the dataset records themselves; for the moment, only text fields are supported. These are the ones that will be used to provide responses to the questions.
| Field Name | Title | Type | Required | Markdown |
| ---------- | ----- | ---- | -------- | -------- |
| input | input | text | True | True |
| instructions | instructions | text | True | False |
The **questions** are the questions that will be asked to the annotators. They can be of different types, such as rating, text, label_selection, multi_label_selection, or ranking.
| Question Name | Title | Type | Required | Description | Values/Labels |
| ------------- | ----- | ---- | -------- | ----------- | ------------- |
| instruction-rating | How would you rate the generated instruction? | rating | True | N/A | [1, 2, 3, 4, 5, 6, 7, 8, 9, 10] |
The **suggestions** are human- or machine-generated recommendations for each question, intended to assist the annotator during the annotation process. They are always linked to existing questions and are stored in columns named by appending "-suggestion" and "-suggestion-metadata" to the question name, containing the suggested value(s) and their metadata, respectively. The possible values are therefore the same as in the table above; only the column names change, with "-suggestion" for the value and "-suggestion-metadata" for its metadata.
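As a quick illustration of that naming convention, a minimal sketch (assuming the single `train` split described below and the `instruction-rating` question defined in this dataset) could be:
```python
from datasets import load_dataset

ds = load_dataset("sdiazlor/evol-test-3.5", split="train")
record = ds[0]

# One column per question, plus the suggestion columns derived from its name.
print(record["instruction-rating"])                      # annotator responses (may be empty)
print(record["instruction-rating-suggestion"])           # suggested value, if any
print(record["instruction-rating-suggestion-metadata"])  # {"agent": ..., "score": ..., "type": ...}
```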
The **metadata** is a dictionary that can be used to provide additional information about the dataset record. This can be useful to give the annotators extra context, for example a link to the original source of the record, or details such as the author, the date, or the source. The metadata is always optional, and can be potentially linked to the `metadata_properties` defined in the dataset configuration file in `argilla.yaml`.
**✨ NEW** The **vectors** are additional columns containing floating-point vectors, constrained to the dimensions pre-defined in the **vectors_settings** when configuring the vectors within the dataset; the vectors are always 1-dimensional arrays. The **vectors** are optional and identified by the pre-defined vector name in the dataset configuration file in `argilla.yaml`.
| Vector Name | Title | Dimensions |
|-------------|-------|------------|
| input | input | [1, 384] |
| instructions | instructions | [1, 384] |
| Metadata Name | Title | Type | Values | Visible for Annotators |
| ------------- | ----- | ---- | ------ | ---------------------- |
| length-input | length-input | integer | None - None | True |
| length-instruction | length-instruction | integer | None - None | True |
| input_n_tokens | Input N Tokens | integer | None - None | True |
| input_n_unique_tokens | Input N Unique Tokens | integer | None - None | True |
| input_n_sentences | Input N Sentences | integer | None - None | True |
| input_perplexity | Input Perplexity | float | None - None | True |
| input_entropy | Input Entropy | float | None - None | True |
| input_flesch_reading_ease | Input Flesch Reading Ease | float | None - None | True |
| instructions_n_tokens | Instructions N Tokens | integer | None - None | True |
| instructions_n_unique_tokens | Instructions N Unique Tokens | integer | None - None | True |
| instructions_n_sentences | Instructions N Sentences | integer | None - None | True |
| instructions_perplexity | Instructions Perplexity | float | None - None | True |
| instructions_entropy | Instructions Entropy | float | None - None | True |
| instructions_flesch_reading_ease | Instructions Flesch Reading Ease | float | None - None | True |
The **guidelines** are optional as well, and are just a plain string that can be used to provide instructions to the annotators. Find those in the [annotation guidelines](#annotation-guidelines) section.
### Data Instances
An example of a dataset instance in Argilla looks as follows:
```json
{
"external_id": null,
"fields": {
"input": "Choices:\n+ Yes.\n+ No.\nQ: Title: The Gate:to the mind\u0027s eye is number 1, Into the Mind\u0027s eye is number 2, turbulence is 3 Review: all i can say is that \"The Gate:to the mind\u0027s eye\" is the best computer animation trip i have ever experienced still to this day in 2009 and it was made in 1994. Thomas Dolby does a score to the film that\u0027s half trance and half Pink Floyd sounding. It is by far the best film and soundtrack out of all of the mind\u0027s eye series. This Turbulence is okay during one scene which is really trippy, but it\u0027s very very short and altogether i was unhappy when it was over. I\u0027d like to compliment anyone who had anything to do with the making of it though. It did take some talent to make i\u0027ll give it that. Just poorly arranged and should not be sold in stores in a box with art on the cover. someone made alot of money on this back in the day and it just doesn\u0027t hold a candle to the 1994 one i was talking about. Is this product review negative?\nA:",
"instructions": "Choices:\n+ Yes.\n+ No.\n+ I cannot determine from the given information.\nQ: Title: The Gate:to the mind\u0027s eye is number 1, Into the Mind\u0027s eye is number 2, turbulence is 3 Review: all i can say is that \"The Gate:to the mind\u0027s eye\" is the best computer animation trip i have ever experienced still to this day in 2009 and it was made in 1994. Thomas Dolby does a score to the film that\u0027s half trance and half Pink Floyd sounding. It is by far the best film and soundtrack out of all of the mind\u0027s eye series. This Turbulence is okay during one scene which is really trippy, but it\u0027s very very short and altogether i was unhappy when it was over. I\u0027d like to compliment anyone who had anything to do with the making of it though. It did take some talent to make i\u0027ll give it that. Just poorly arranged and should not be sold in stores in a box with art on the cover. someone made alot of money on this back in the day and it just doesn\u0027t hold a candle to the 1994 one i was talking about. Is this product review negative? Please provide a detailed explanation for your answer, considering both the positive and negative aspects mentioned in the review."
},
"metadata": {
"generation-model": [
"gpt-3.5-turbo"
],
"input_entropy": 7.15,
"input_flesch_reading_ease": 93.78,
"input_n_sentences": 14,
"input_n_tokens": 200,
"input_n_unique_tokens": 111,
"input_perplexity": 1270.39,
"instructions_entropy": 8.24,
"instructions_flesch_reading_ease": 87.71,
"instructions_n_sentences": 16,
"instructions_n_tokens": 227,
"instructions_n_unique_tokens": 127,
"instructions_perplexity": 3787.28,
"length-input": 971,
"length-instructions": 1148
},
"responses": [],
"suggestions": [],
"vectors": {}
}
```
While the same record in HuggingFace `datasets` looks as follows:
```json
{
"external_id": null,
"input": "Choices:\n+ Yes.\n+ No.\nQ: Title: The Gate:to the mind\u0027s eye is number 1, Into the Mind\u0027s eye is number 2, turbulence is 3 Review: all i can say is that \"The Gate:to the mind\u0027s eye\" is the best computer animation trip i have ever experienced still to this day in 2009 and it was made in 1994. Thomas Dolby does a score to the film that\u0027s half trance and half Pink Floyd sounding. It is by far the best film and soundtrack out of all of the mind\u0027s eye series. This Turbulence is okay during one scene which is really trippy, but it\u0027s very very short and altogether i was unhappy when it was over. I\u0027d like to compliment anyone who had anything to do with the making of it though. It did take some talent to make i\u0027ll give it that. Just poorly arranged and should not be sold in stores in a box with art on the cover. someone made alot of money on this back in the day and it just doesn\u0027t hold a candle to the 1994 one i was talking about. Is this product review negative?\nA:",
"instruction-rating": [],
"instruction-rating-suggestion": null,
"instruction-rating-suggestion-metadata": {
"agent": null,
"score": null,
"type": null
},
"instructions": "Choices:\n+ Yes.\n+ No.\n+ I cannot determine from the given information.\nQ: Title: The Gate:to the mind\u0027s eye is number 1, Into the Mind\u0027s eye is number 2, turbulence is 3 Review: all i can say is that \"The Gate:to the mind\u0027s eye\" is the best computer animation trip i have ever experienced still to this day in 2009 and it was made in 1994. Thomas Dolby does a score to the film that\u0027s half trance and half Pink Floyd sounding. It is by far the best film and soundtrack out of all of the mind\u0027s eye series. This Turbulence is okay during one scene which is really trippy, but it\u0027s very very short and altogether i was unhappy when it was over. I\u0027d like to compliment anyone who had anything to do with the making of it though. It did take some talent to make i\u0027ll give it that. Just poorly arranged and should not be sold in stores in a box with art on the cover. someone made alot of money on this back in the day and it just doesn\u0027t hold a candle to the 1994 one i was talking about. Is this product review negative? Please provide a detailed explanation for your answer, considering both the positive and negative aspects mentioned in the review.",
"metadata": "{\"length-input\": 971, \"length-instructions\": 1148, \"generation-model\": [\"gpt-3.5-turbo\"], \"input_n_tokens\": 200, \"input_n_unique_tokens\": 111, \"input_n_sentences\": 14, \"input_perplexity\": 1270.39, \"input_entropy\": 7.15, \"input_flesch_reading_ease\": 93.78, \"instructions_n_tokens\": 227, \"instructions_n_unique_tokens\": 127, \"instructions_n_sentences\": 16, \"instructions_perplexity\": 3787.28, \"instructions_entropy\": 8.24, \"instructions_flesch_reading_ease\": 87.71}",
"vectors": {
"input": null,
"instructions": null
}
}
```
### Data Fields
Among the dataset fields, we differentiate between the following:
* **Fields:** These are the dataset records themselves; for the moment, only text fields are supported. These are the ones that will be used to provide responses to the questions.
* **input** is of type `text`.
* **instructions** is of type `text`.
* **Questions:** These are the questions that will be asked to the annotators. They can be of different types, such as `RatingQuestion`, `TextQuestion`, `LabelQuestion`, `MultiLabelQuestion`, and `RankingQuestion`.
* **instruction-rating** is of type `rating` with the following allowed values [1, 2, 3, 4, 5, 6, 7, 8, 9, 10].
* **Suggestions:** As of Argilla 1.13.0, suggestions have been included to provide the annotators with recommendations that ease or assist the annotation process. Suggestions are linked to the existing questions, are always optional, and contain not just the suggestion itself, but also the metadata linked to it, if applicable.
* (optional) **instruction-rating-suggestion** is of type `rating` with the following allowed values [1, 2, 3, 4, 5, 6, 7, 8, 9, 10].
* **✨ NEW** **Vectors**: As of Argilla 1.19.0, vectors have been included to support similarity search, allowing similar records to be explored via vector search powered by the configured search engine. The vectors are optional and are not visible in the UI; they are uploaded and used internally, and are constrained to the dimensions previously defined in their settings.
* (optional) **input** is of type `float32` and has a dimension of (1, `384`).
* (optional) **instructions** is of type `float32` and has a dimension of (1, `384`).
Additionally, we also have two more fields that are optional and are the following:
* **metadata:** This is an optional field that can be used to provide additional information about the dataset record, for example extra context for the annotators, a link to the original source of the record, or details such as the author, the date, or the source. It can be potentially linked to the `metadata_properties` defined in the dataset configuration file in `argilla.yaml`; a parsing sketch follows this list.
* **external_id:** This is an optional field that can be used to provide an external ID for the dataset record. This can be useful if you want to link the dataset record to an external resource, such as a database or a file.
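Since the `metadata` field is serialized as a JSON string in the HuggingFace `datasets` export (see the record example above), a minimal parsing sketch, assuming the metadata keys shown in this card, might be:
```python
import json

from datasets import load_dataset

ds = load_dataset("sdiazlor/evol-test-3.5", split="train")
record = ds[0]

# "metadata" is stored as a JSON string, so parse it before use.
metadata = json.loads(record["metadata"])
print(metadata["generation-model"])  # e.g. ["gpt-3.5-turbo"]
print(metadata["input_n_tokens"])    # token count of the input field
print(record["external_id"])         # optional external identifier (may be None)
```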
### Data Splits
The dataset contains a single split, which is `train`.
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation guidelines
[More Information Needed]
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Y11IC/yy_ic_mini-platypus | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 4153564
num_examples: 1000
download_size: 2242125
dataset_size: 4153564
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_mayacinka__West-Ramen-7Bx4 | ---
pretty_name: Evaluation run of mayacinka/West-Ramen-7Bx4
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [mayacinka/West-Ramen-7Bx4](https://huggingface.co/mayacinka/West-Ramen-7Bx4)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_mayacinka__West-Ramen-7Bx4\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-01T04:12:32.071780](https://huggingface.co/datasets/open-llm-leaderboard/details_mayacinka__West-Ramen-7Bx4/blob/main/results_2024-03-01T04-12-32.071780.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.629685276004943,\n\
\ \"acc_stderr\": 0.03266326963288471,\n \"acc_norm\": 0.6316308215542414,\n\
\ \"acc_norm_stderr\": 0.033325469709942476,\n \"mc1\": 0.44430844553243576,\n\
\ \"mc1_stderr\": 0.017394586250743173,\n \"mc2\": 0.6100230244317076,\n\
\ \"mc2_stderr\": 0.015368475683442384\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6262798634812287,\n \"acc_stderr\": 0.014137708601759088,\n\
\ \"acc_norm\": 0.6757679180887372,\n \"acc_norm_stderr\": 0.013678810399518826\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6571400119498108,\n\
\ \"acc_stderr\": 0.004736950810617789,\n \"acc_norm\": 0.8552081258713403,\n\
\ \"acc_norm_stderr\": 0.0035117170854519868\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5703703703703704,\n\
\ \"acc_stderr\": 0.04276349494376599,\n \"acc_norm\": 0.5703703703703704,\n\
\ \"acc_norm_stderr\": 0.04276349494376599\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.037827289808654685,\n\
\ \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.037827289808654685\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.027834912527544067,\n\
\ \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.027834912527544067\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n\
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n\
\ \"acc_stderr\": 0.036430371689585475,\n \"acc_norm\": 0.6473988439306358,\n\
\ \"acc_norm_stderr\": 0.036430371689585475\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.047840607041056527,\n\
\ \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.047840607041056527\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5276595744680851,\n \"acc_stderr\": 0.03263597118409769,\n\
\ \"acc_norm\": 0.5276595744680851,\n \"acc_norm_stderr\": 0.03263597118409769\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.43859649122807015,\n\
\ \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.43859649122807015,\n\
\ \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42063492063492064,\n \"acc_stderr\": 0.025424835086924,\n \"acc_norm\"\
: 0.42063492063492064,\n \"acc_norm_stderr\": 0.025424835086924\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n\
\ \"acc_stderr\": 0.04390259265377562,\n \"acc_norm\": 0.40476190476190477,\n\
\ \"acc_norm_stderr\": 0.04390259265377562\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.635483870967742,\n\
\ \"acc_stderr\": 0.02737987122994324,\n \"acc_norm\": 0.635483870967742,\n\
\ \"acc_norm_stderr\": 0.02737987122994324\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\"\
: 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.031922715695483016,\n\
\ \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.031922715695483016\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7828282828282829,\n \"acc_stderr\": 0.02937661648494562,\n \"\
acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.02937661648494562\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8497409326424871,\n \"acc_stderr\": 0.02578772318072387,\n\
\ \"acc_norm\": 0.8497409326424871,\n \"acc_norm_stderr\": 0.02578772318072387\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6256410256410256,\n \"acc_stderr\": 0.0245375915728305,\n \
\ \"acc_norm\": 0.6256410256410256,\n \"acc_norm_stderr\": 0.0245375915728305\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948492,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948492\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6638655462184874,\n \"acc_stderr\": 0.030684737115135356,\n\
\ \"acc_norm\": 0.6638655462184874,\n \"acc_norm_stderr\": 0.030684737115135356\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8348623853211009,\n \"acc_stderr\": 0.01591955782997604,\n \"\
acc_norm\": 0.8348623853211009,\n \"acc_norm_stderr\": 0.01591955782997604\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4675925925925926,\n \"acc_stderr\": 0.03402801581358966,\n \"\
acc_norm\": 0.4675925925925926,\n \"acc_norm_stderr\": 0.03402801581358966\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.803921568627451,\n \"acc_stderr\": 0.027865942286639325,\n \"\
acc_norm\": 0.803921568627451,\n \"acc_norm_stderr\": 0.027865942286639325\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7932489451476793,\n \"acc_stderr\": 0.0263616516683891,\n \
\ \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.0263616516683891\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n\
\ \"acc_stderr\": 0.03160295143776679,\n \"acc_norm\": 0.6681614349775785,\n\
\ \"acc_norm_stderr\": 0.03160295143776679\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.03641297081313728,\n\
\ \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.03641297081313728\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8264462809917356,\n \"acc_stderr\": 0.03457272836917669,\n \"\
acc_norm\": 0.8264462809917356,\n \"acc_norm_stderr\": 0.03457272836917669\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.04236511258094633,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.04236511258094633\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7177914110429447,\n \"acc_stderr\": 0.03536117886664742,\n\
\ \"acc_norm\": 0.7177914110429447,\n \"acc_norm_stderr\": 0.03536117886664742\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5267857142857143,\n\
\ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.5267857142857143,\n\
\ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8974358974358975,\n\
\ \"acc_stderr\": 0.01987565502786746,\n \"acc_norm\": 0.8974358974358975,\n\
\ \"acc_norm_stderr\": 0.01987565502786746\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8122605363984674,\n\
\ \"acc_stderr\": 0.013964393769899134,\n \"acc_norm\": 0.8122605363984674,\n\
\ \"acc_norm_stderr\": 0.013964393769899134\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7052023121387283,\n \"acc_stderr\": 0.024547617794803828,\n\
\ \"acc_norm\": 0.7052023121387283,\n \"acc_norm_stderr\": 0.024547617794803828\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4145251396648045,\n\
\ \"acc_stderr\": 0.016476342210254,\n \"acc_norm\": 0.4145251396648045,\n\
\ \"acc_norm_stderr\": 0.016476342210254\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7320261437908496,\n \"acc_stderr\": 0.025360603796242553,\n\
\ \"acc_norm\": 0.7320261437908496,\n \"acc_norm_stderr\": 0.025360603796242553\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n\
\ \"acc_stderr\": 0.026003301117885135,\n \"acc_norm\": 0.7009646302250804,\n\
\ \"acc_norm_stderr\": 0.026003301117885135\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7253086419753086,\n \"acc_stderr\": 0.024836057868294677,\n\
\ \"acc_norm\": 0.7253086419753086,\n \"acc_norm_stderr\": 0.024836057868294677\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.45241199478487615,\n\
\ \"acc_stderr\": 0.012712265105889135,\n \"acc_norm\": 0.45241199478487615,\n\
\ \"acc_norm_stderr\": 0.012712265105889135\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6507352941176471,\n \"acc_stderr\": 0.02895975519682487,\n\
\ \"acc_norm\": 0.6507352941176471,\n \"acc_norm_stderr\": 0.02895975519682487\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6781045751633987,\n \"acc_stderr\": 0.018901015322093092,\n \
\ \"acc_norm\": 0.6781045751633987,\n \"acc_norm_stderr\": 0.018901015322093092\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302505,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302505\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6417910447761194,\n\
\ \"acc_stderr\": 0.03390393042268815,\n \"acc_norm\": 0.6417910447761194,\n\
\ \"acc_norm_stderr\": 0.03390393042268815\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4939759036144578,\n\
\ \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.4939759036144578,\n\
\ \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.847953216374269,\n \"acc_stderr\": 0.027539122889061456,\n\
\ \"acc_norm\": 0.847953216374269,\n \"acc_norm_stderr\": 0.027539122889061456\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.44430844553243576,\n\
\ \"mc1_stderr\": 0.017394586250743173,\n \"mc2\": 0.6100230244317076,\n\
\ \"mc2_stderr\": 0.015368475683442384\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8121546961325967,\n \"acc_stderr\": 0.010977481103435091\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5799848369977255,\n \
\ \"acc_stderr\": 0.01359512168852048\n }\n}\n```"
repo_url: https://huggingface.co/mayacinka/West-Ramen-7Bx4
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_01T04_12_32.071780
path:
- '**/details_harness|arc:challenge|25_2024-03-01T04-12-32.071780.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-01T04-12-32.071780.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_01T04_12_32.071780
path:
- '**/details_harness|gsm8k|5_2024-03-01T04-12-32.071780.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-01T04-12-32.071780.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_01T04_12_32.071780
path:
- '**/details_harness|hellaswag|10_2024-03-01T04-12-32.071780.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-01T04-12-32.071780.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_01T04_12_32.071780
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T04-12-32.071780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T04-12-32.071780.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T04-12-32.071780.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_01T04_12_32.071780
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T04-12-32.071780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T04-12-32.071780.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_01T04_12_32.071780
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T04-12-32.071780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T04-12-32.071780.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_01T04_12_32.071780
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T04-12-32.071780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T04-12-32.071780.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_01T04_12_32.071780
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T04-12-32.071780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T04-12-32.071780.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_01T04_12_32.071780
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T04-12-32.071780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T04-12-32.071780.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_01T04_12_32.071780
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T04-12-32.071780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T04-12-32.071780.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_01T04_12_32.071780
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T04-12-32.071780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T04-12-32.071780.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_01T04_12_32.071780
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T04-12-32.071780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T04-12-32.071780.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_01T04_12_32.071780
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T04-12-32.071780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T04-12-32.071780.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_01T04_12_32.071780
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T04-12-32.071780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T04-12-32.071780.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_01T04_12_32.071780
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T04-12-32.071780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T04-12-32.071780.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_01T04_12_32.071780
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T04-12-32.071780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T04-12-32.071780.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_01T04_12_32.071780
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T04-12-32.071780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T04-12-32.071780.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_01T04_12_32.071780
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T04-12-32.071780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T04-12-32.071780.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_01T04_12_32.071780
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T04-12-32.071780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T04-12-32.071780.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_01T04_12_32.071780
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T04-12-32.071780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T04-12-32.071780.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_01T04_12_32.071780
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T04-12-32.071780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T04-12-32.071780.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_01T04_12_32.071780
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T04-12-32.071780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T04-12-32.071780.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_01T04_12_32.071780
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T04-12-32.071780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T04-12-32.071780.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_01T04_12_32.071780
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T04-12-32.071780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T04-12-32.071780.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_01T04_12_32.071780
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T04-12-32.071780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T04-12-32.071780.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_01T04_12_32.071780
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T04-12-32.071780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T04-12-32.071780.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_01T04_12_32.071780
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T04-12-32.071780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T04-12-32.071780.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_01T04_12_32.071780
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T04-12-32.071780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T04-12-32.071780.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_01T04_12_32.071780
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T04-12-32.071780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T04-12-32.071780.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_01T04_12_32.071780
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T04-12-32.071780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T04-12-32.071780.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_01T04_12_32.071780
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T04-12-32.071780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T04-12-32.071780.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_01T04_12_32.071780
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T04-12-32.071780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T04-12-32.071780.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_01T04_12_32.071780
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T04-12-32.071780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T04-12-32.071780.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_01T04_12_32.071780
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T04-12-32.071780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T04-12-32.071780.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_01T04_12_32.071780
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T04-12-32.071780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T04-12-32.071780.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_01T04_12_32.071780
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T04-12-32.071780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T04-12-32.071780.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_01T04_12_32.071780
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T04-12-32.071780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T04-12-32.071780.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_01T04_12_32.071780
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T04-12-32.071780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T04-12-32.071780.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_01T04_12_32.071780
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T04-12-32.071780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T04-12-32.071780.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_01T04_12_32.071780
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T04-12-32.071780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T04-12-32.071780.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_01T04_12_32.071780
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T04-12-32.071780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T04-12-32.071780.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_01T04_12_32.071780
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T04-12-32.071780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T04-12-32.071780.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_01T04_12_32.071780
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-01T04-12-32.071780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-01T04-12-32.071780.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_01T04_12_32.071780
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T04-12-32.071780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T04-12-32.071780.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_01T04_12_32.071780
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T04-12-32.071780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T04-12-32.071780.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_01T04_12_32.071780
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T04-12-32.071780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T04-12-32.071780.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_01T04_12_32.071780
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T04-12-32.071780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T04-12-32.071780.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_01T04_12_32.071780
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T04-12-32.071780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T04-12-32.071780.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_01T04_12_32.071780
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T04-12-32.071780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T04-12-32.071780.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_01T04_12_32.071780
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T04-12-32.071780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T04-12-32.071780.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_01T04_12_32.071780
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T04-12-32.071780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T04-12-32.071780.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_01T04_12_32.071780
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T04-12-32.071780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T04-12-32.071780.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_01T04_12_32.071780
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T04-12-32.071780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T04-12-32.071780.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_01T04_12_32.071780
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T04-12-32.071780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T04-12-32.071780.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_01T04_12_32.071780
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T04-12-32.071780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T04-12-32.071780.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_01T04_12_32.071780
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T04-12-32.071780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T04-12-32.071780.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_01T04_12_32.071780
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T04-12-32.071780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T04-12-32.071780.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_01T04_12_32.071780
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T04-12-32.071780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T04-12-32.071780.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_01T04_12_32.071780
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T04-12-32.071780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T04-12-32.071780.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_01T04_12_32.071780
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T04-12-32.071780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T04-12-32.071780.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_01T04_12_32.071780
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T04-12-32.071780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T04-12-32.071780.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_01T04_12_32.071780
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-01T04-12-32.071780.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-01T04-12-32.071780.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_01T04_12_32.071780
path:
- '**/details_harness|winogrande|5_2024-03-01T04-12-32.071780.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-01T04-12-32.071780.parquet'
- config_name: results
data_files:
- split: 2024_03_01T04_12_32.071780
path:
- results_2024-03-01T04-12-32.071780.parquet
- split: latest
path:
- results_2024-03-01T04-12-32.071780.parquet
---
# Dataset Card for Evaluation run of mayacinka/West-Ramen-7Bx4
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [mayacinka/West-Ramen-7Bx4](https://huggingface.co/mayacinka/West-Ramen-7Bx4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_mayacinka__West-Ramen-7Bx4",
"harness_winogrande_5",
split="train")
```
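The aggregated metrics shown below can be loaded the same way through the `results` configuration declared in the metadata above. The following is a minimal sketch; the exact column layout is not documented in this card, so it only prints the available columns:
```python
from datasets import load_dataset

# Load the aggregated results of the latest run via the "results" config
# and its "latest" split (both declared in the YAML metadata above).
results = load_dataset("open-llm-leaderboard/details_mayacinka__West-Ramen-7Bx4",
	"results",
	split="latest")

# Inspect which metric columns are stored for this run.
print(results.column_names)
```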
## Latest results
These are the [latest results from run 2024-03-01T04:12:32.071780](https://huggingface.co/datasets/open-llm-leaderboard/details_mayacinka__West-Ramen-7Bx4/blob/main/results_2024-03-01T04-12-32.071780.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.629685276004943,
"acc_stderr": 0.03266326963288471,
"acc_norm": 0.6316308215542414,
"acc_norm_stderr": 0.033325469709942476,
"mc1": 0.44430844553243576,
"mc1_stderr": 0.017394586250743173,
"mc2": 0.6100230244317076,
"mc2_stderr": 0.015368475683442384
},
"harness|arc:challenge|25": {
"acc": 0.6262798634812287,
"acc_stderr": 0.014137708601759088,
"acc_norm": 0.6757679180887372,
"acc_norm_stderr": 0.013678810399518826
},
"harness|hellaswag|10": {
"acc": 0.6571400119498108,
"acc_stderr": 0.004736950810617789,
"acc_norm": 0.8552081258713403,
"acc_norm_stderr": 0.0035117170854519868
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5703703703703704,
"acc_stderr": 0.04276349494376599,
"acc_norm": 0.5703703703703704,
"acc_norm_stderr": 0.04276349494376599
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.037827289808654685,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.037827289808654685
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.027834912527544067,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.027834912527544067
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.036430371689585475,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.036430371689585475
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.047840607041056527,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.047840607041056527
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.74,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5276595744680851,
"acc_stderr": 0.03263597118409769,
"acc_norm": 0.5276595744680851,
"acc_norm_stderr": 0.03263597118409769
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.43859649122807015,
"acc_stderr": 0.04668000738510455,
"acc_norm": 0.43859649122807015,
"acc_norm_stderr": 0.04668000738510455
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.025424835086924,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.025424835086924
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.04390259265377562,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.04390259265377562
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.635483870967742,
"acc_stderr": 0.02737987122994324,
"acc_norm": 0.635483870967742,
"acc_norm_stderr": 0.02737987122994324
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.031922715695483016,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.031922715695483016
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.02937661648494562,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.02937661648494562
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8497409326424871,
"acc_stderr": 0.02578772318072387,
"acc_norm": 0.8497409326424871,
"acc_norm_stderr": 0.02578772318072387
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6256410256410256,
"acc_stderr": 0.0245375915728305,
"acc_norm": 0.6256410256410256,
"acc_norm_stderr": 0.0245375915728305
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.028742040903948492,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.028742040903948492
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6638655462184874,
"acc_stderr": 0.030684737115135356,
"acc_norm": 0.6638655462184874,
"acc_norm_stderr": 0.030684737115135356
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8348623853211009,
"acc_stderr": 0.01591955782997604,
"acc_norm": 0.8348623853211009,
"acc_norm_stderr": 0.01591955782997604
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4675925925925926,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.4675925925925926,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.803921568627451,
"acc_stderr": 0.027865942286639325,
"acc_norm": 0.803921568627451,
"acc_norm_stderr": 0.027865942286639325
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7932489451476793,
"acc_stderr": 0.0263616516683891,
"acc_norm": 0.7932489451476793,
"acc_norm_stderr": 0.0263616516683891
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6681614349775785,
"acc_stderr": 0.03160295143776679,
"acc_norm": 0.6681614349775785,
"acc_norm_stderr": 0.03160295143776679
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.03641297081313728,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.03641297081313728
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8264462809917356,
"acc_stderr": 0.03457272836917669,
"acc_norm": 0.8264462809917356,
"acc_norm_stderr": 0.03457272836917669
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.04236511258094633,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.04236511258094633
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7177914110429447,
"acc_stderr": 0.03536117886664742,
"acc_norm": 0.7177914110429447,
"acc_norm_stderr": 0.03536117886664742
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5267857142857143,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.5267857142857143,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8974358974358975,
"acc_stderr": 0.01987565502786746,
"acc_norm": 0.8974358974358975,
"acc_norm_stderr": 0.01987565502786746
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8122605363984674,
"acc_stderr": 0.013964393769899134,
"acc_norm": 0.8122605363984674,
"acc_norm_stderr": 0.013964393769899134
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7052023121387283,
"acc_stderr": 0.024547617794803828,
"acc_norm": 0.7052023121387283,
"acc_norm_stderr": 0.024547617794803828
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4145251396648045,
"acc_stderr": 0.016476342210254,
"acc_norm": 0.4145251396648045,
"acc_norm_stderr": 0.016476342210254
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7320261437908496,
"acc_stderr": 0.025360603796242553,
"acc_norm": 0.7320261437908496,
"acc_norm_stderr": 0.025360603796242553
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.026003301117885135,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.026003301117885135
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7253086419753086,
"acc_stderr": 0.024836057868294677,
"acc_norm": 0.7253086419753086,
"acc_norm_stderr": 0.024836057868294677
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.45241199478487615,
"acc_stderr": 0.012712265105889135,
"acc_norm": 0.45241199478487615,
"acc_norm_stderr": 0.012712265105889135
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6507352941176471,
"acc_stderr": 0.02895975519682487,
"acc_norm": 0.6507352941176471,
"acc_norm_stderr": 0.02895975519682487
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6781045751633987,
"acc_stderr": 0.018901015322093092,
"acc_norm": 0.6781045751633987,
"acc_norm_stderr": 0.018901015322093092
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302505,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302505
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6417910447761194,
"acc_stderr": 0.03390393042268815,
"acc_norm": 0.6417910447761194,
"acc_norm_stderr": 0.03390393042268815
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4939759036144578,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.4939759036144578,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.847953216374269,
"acc_stderr": 0.027539122889061456,
"acc_norm": 0.847953216374269,
"acc_norm_stderr": 0.027539122889061456
},
"harness|truthfulqa:mc|0": {
"mc1": 0.44430844553243576,
"mc1_stderr": 0.017394586250743173,
"mc2": 0.6100230244317076,
"mc2_stderr": 0.015368475683442384
},
"harness|winogrande|5": {
"acc": 0.8121546961325967,
"acc_stderr": 0.010977481103435091
},
"harness|gsm8k|5": {
"acc": 0.5799848369977255,
"acc_stderr": 0.01359512168852048
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
crumb/Open-Orca-k16 | ---
dataset_info:
features:
- name: id
dtype: string
- name: system_prompt
dtype: string
- name: question
dtype: string
- name: response
dtype: string
- name: cluster
dtype: int64
splits:
- name: train
num_bytes: 1796489136
num_examples: 994896
download_size: 1023054925
dataset_size: 1796489136
---
# Dataset Card for "Open-Orca-k16"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Back-up/chung-khoan-demo-p8 | ---
dataset_info:
features:
- name: url
dtype: string
- name: title
dtype: string
- name: date
dtype: string
- name: view
struct:
- name: number_of_response
dtype: string
- name: number_of_view
dtype: string
- name: content
list:
- name: res
dtype: string
splits:
- name: train
num_bytes: 62379038
num_examples: 13672
download_size: 22152958
dataset_size: 62379038
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/chibana_sumika_watashinoyuriwaoshigotodesu | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Chibana Sumika
This is the dataset of Chibana Sumika, containing 300 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 300 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 681 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 300 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 300 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 300 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 300 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 300 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 681 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 681 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 681 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
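The packages above can be fetched programmatically with `huggingface_hub`. The snippet below is a small sketch that assumes the zip files sit at the root of this dataset repository, as the relative download links suggest (the same pattern is used by other CyberHarem datasets in this collection):
```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# Download one of the packages listed above (here: the 512x512 aligned set).
zip_file = hf_hub_download(
    repo_id='CyberHarem/chibana_sumika_watashinoyuriwaoshigotodesu',
    repo_type='dataset',
    filename='dataset-512x512.zip',
)

# Extract the images and their tag files to a local directory.
out_dir = 'chibana_sumika_512x512'
os.makedirs(out_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(out_dir)
```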
|
xaviviro/oasst1_ca_chatml | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: validation
num_bytes: 494618
num_examples: 517
- name: train
num_bytes: 9341938
num_examples: 9841
download_size: 4984795
dataset_size: 9836556
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
- split: train
path: data/train-*
language:
- ca
--- |
StephanAkkerman/financial-tweets-crypto | ---
license: mit
task_categories:
- text-classification
tags:
- tweet
- tweets
- finance
- crypto
- fintwit
---
# Financial Tweets - Cryptocurrency
This dataset is part of the scraped financial tweets that I collected from a variety of financial influencers on Twitter; all the datasets can be found here:
- Crypto: https://huggingface.co/datasets/StephanAkkerman/financial-tweets-crypto
- Stocks (and forex): https://huggingface.co/datasets/StephanAkkerman/financial-tweets-stocks
- Other (Tweet without cash tags): https://huggingface.co/datasets/StephanAkkerman/financial-tweets-other
## Data Fields
The data fields are as follows:
* `timestap`: The time the tweet was sent.
* `tweet_text`: All of the text of the tweet, including quoted tweets (prefixed with `>`).
* `tweet_url`: The URL of the tweet.
* `tweet_type`: The type of tweet; this can be a tweet, retweet, or quote tweet.
* `price_of_ticker`: The price of the tickers mentioned.
* `change_of_ticker`: The 24h price change of the tickers.
* `tickers_mentioned`: All the tickers that are mentioned in the tweet.
* `category`: The category of the tweet; the suffix `_images` means that the tweet included an image.
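As a quick illustration of how these fields can be used, the sketch below loads this crypto collection and filters for tweets that mention a given ticker. The `train` split name and the string handling of `tickers_mentioned` are assumptions, since the card does not spell out the split layout or the exact column type; `BTC` is just an example value:
```python
from datasets import load_dataset

# Load the crypto tweets; a "train" split is assumed (not documented in the card).
ds = load_dataset("StephanAkkerman/financial-tweets-crypto", split="train")

# Keep only tweets whose ticker field mentions BTC (example value).
# str() is used so this works whether the field is stored as a string or a list.
btc_tweets = ds.filter(lambda row: "BTC" in str(row["tickers_mentioned"] or ""))
print(len(btc_tweets))
```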
|
graredcr/test1 | ---
license: apache-2.0
task_categories:
- question-answering
language:
- en
--- |
BrunoGR/Emo_support | ---
language:
- es
license: apache-2.0
size_categories:
- 100K<n<1M
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: texto
dtype: string
- name: etiqueta
dtype: string
splits:
- name: test
num_bytes: 3896475
num_examples: 27445
- name: train
num_bytes: 15548629
num_examples: 112347
- name: validation
num_bytes: 280112
num_examples: 2001
download_size: 10339604
dataset_size: 19725216
---
# Dataset Card for "Emo_support"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Mitsuki-Sakamoto/alpaca_farm-reward-model-deberta-v3-large-v2-re-preference-256-nsample-4 | ---
dataset_info:
config_name: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: preference
dtype: int64
- name: output_1
dtype: string
- name: output_2
dtype: string
- name: reward_model_prompt_format
dtype: string
- name: gen_prompt_format
dtype: string
- name: gen_kwargs
struct:
- name: do_sample
dtype: bool
- name: max_new_tokens
dtype: int64
- name: pad_token_id
dtype: int64
- name: top_k
dtype: int64
- name: top_p
dtype: float64
- name: reward_1
dtype: float64
- name: reward_2
dtype: float64
- name: n_samples
dtype: int64
splits:
- name: preference
num_bytes: 58429598
num_examples: 20001
download_size: 26745865
dataset_size: 58429598
configs:
- config_name: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500
data_files:
- split: preference
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/preference-*
---
|
autoevaluate/autoeval-staging-eval-glue-mrpc-e15d1b-14666001 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- glue
eval_info:
task: natural_language_inference
model: sgugger/bert-finetuned-mrpc
metrics: []
dataset_name: glue
dataset_config: mrpc
dataset_split: validation
col_mapping:
text1: sentence1
text2: sentence2
target: label
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Natural Language Inference
* Model: sgugger/bert-finetuned-mrpc
* Dataset: glue
* Config: mrpc
* Split: validation
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
CyberHarem/sengoku_chihiro_sakurasounopetnakanojo | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Sengoku Chihiro (Sakurasou no Pet na Kanojo)
This is the dataset of Sengoku Chihiro (Sakurasou no Pet na Kanojo), containing 71 images and their tags.
The core tags of this character are `short_hair, red_hair, brown_hair, brown_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 71 | 64.61 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sengoku_chihiro_sakurasounopetnakanojo/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 71 | 50.58 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sengoku_chihiro_sakurasounopetnakanojo/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 136 | 93.00 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sengoku_chihiro_sakurasounopetnakanojo/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 71 | 64.59 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sengoku_chihiro_sakurasounopetnakanojo/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 136 | 115.21 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sengoku_chihiro_sakurasounopetnakanojo/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/sengoku_chihiro_sakurasounopetnakanojo',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------|
| 0 | 7 |  |  |  |  |  | 1girl, anime_coloring, red_eyes, solo, blurry, collarbone, smile, white_shirt |
| 1 | 12 |  |  |  |  |  | necklace, 1girl, formal, solo, suit, anime_coloring, jacket |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | anime_coloring | red_eyes | solo | blurry | collarbone | smile | white_shirt | necklace | formal | suit | jacket |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:-----------|:-------|:---------|:-------------|:--------|:--------------|:-----------|:---------|:-------|:---------|
| 0 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | | | | |
| 1 | 12 |  |  |  |  |  | X | X | | X | | | | | X | X | X | X |
|
Rewcifer/best_outputs_selected_50_3model | ---
dataset_info:
features:
- name: true_findings
dtype: string
- name: generated_texts_1
dtype: string
- name: generated_texts_2
dtype: string
- name: generated_texts_3
dtype: string
splits:
- name: train
num_bytes: 108090.4181184669
num_examples: 50
download_size: 83094
dataset_size: 108090.4181184669
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "best_outputs_selected_50_3model"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
chupil96/www | ---
license: apache-2.0
---
|
zolak/twitter_dataset_80_1713176222 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 319151
num_examples: 817
download_size: 163688
dataset_size: 319151
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
kye/all-lucidrain-code-python-tokenized | ---
license: mit
---
|
ehealth_kd | ---
annotations_creators:
- expert-generated
language_creators:
- expert-generated
language:
- es
license:
- cc-by-nc-sa-4.0
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- token-classification
task_ids:
- named-entity-recognition
pretty_name: eHealth-KD
tags:
- relation-prediction
dataset_info:
features:
- name: sentence
dtype: string
- name: entities
list:
- name: ent_id
dtype: string
- name: ent_text
dtype: string
- name: ent_label
dtype:
class_label:
names:
'0': Concept
'1': Action
'2': Predicate
'3': Reference
- name: start_character
dtype: int32
- name: end_character
dtype: int32
- name: relations
list:
- name: rel_id
dtype: string
- name: rel_label
dtype:
class_label:
names:
'0': is-a
'1': same-as
'2': has-property
'3': part-of
'4': causes
'5': entails
'6': in-time
'7': in-place
'8': in-context
'9': subject
'10': target
'11': domain
'12': arg
- name: arg1
dtype: string
- name: arg2
dtype: string
config_name: ehealth_kd
splits:
- name: train
num_bytes: 425713
num_examples: 800
- name: validation
num_bytes: 108154
num_examples: 199
- name: test
num_bytes: 47314
num_examples: 100
download_size: 565900
dataset_size: 581181
---
# Dataset Card for eHealth-KD
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [eHealth-KD homepage](https://knowledge-learning.github.io/ehealthkd-2020/)
- **Repository:** [eHealth-KD repository](https://github.com/knowledge-learning/ehealthkd-2020)
- **Paper:** [eHealth-KD overview paper](http://ceur-ws.org/Vol-2664/eHealth-KD_overview.pdf)
- **Leaderboard:** [eHealth-KD Challenge 2020 official results](https://knowledge-learning.github.io/ehealthkd-2020/results)
- **Point of Contact:** [Yoan Gutiérrez Vázquez](mailto:ygutierrez@dlsi.ua.es) (Organization Committee), [María Grandury](mailto:yacine@huggingface.co) (Dataset Submitter)
### Dataset Summary
Dataset of the eHealth-KD Challenge at IberLEF 2020. It is designed for the identification of semantic
entities and relations in Spanish health documents.
### Supported Tasks and Leaderboards
The eHealth-KD challenge proposes two computational subtasks:
- `named-entity-recognition`: Given a sentence of an eHealth document written in Spanish, the goal of this subtask is to
identify all the entities and their types.
- `relation-prediction`: The purpose of this subtask is to recognise all relevant semantic relationships between the entities recognised.
For an analysis of the most successful approaches of this challenge, read the [eHealth-KD overview paper](http://ceur-ws.org/Vol-2664/eHealth-KD_overview.pdf).
### Languages
The text in the dataset is in Spanish (BCP-47 code: `es`).
## Dataset Structure
### Data Instances
The first example of the eHealth-KD Corpus train set looks as follows:
```
{
  'sentence': 'En la leucemia linfocítica crónica, hay demasiados linfocitos, un tipo de glóbulos blancos.',
  'entities': [
    {
      'ent_id': 'T1',
      'ent_text': 'leucemia linfocítica crónica',
      'ent_label': 0,
      'start_character': 6,
      'end_character': 34
    },
    {
      'ent_id': 'T2',
      'ent_text': 'linfocitos',
      'ent_label': 0,
      'start_character': 51,
      'end_character': 61
    },
    {
      'ent_id': 'T3',
      'ent_text': 'glóbulos blancos',
      'ent_label': 0,
      'start_character': 74,
      'end_character': 90
    }
  ],
  'relations': [
    {
      'rel_id': 'R0',
      'rel_label': 0,
      'arg1': 'T2',
      'arg2': 'T3'
    },
    {
      'rel_id': 'R1',
      'rel_label': 5,
      'arg1': 'T1',
      'arg2': 'T2'
    }
  ]
}
```
### Data Fields
- `sentence`: sentence of an eHealth document written in Spanish
- `entities`: list of entities identified in the sentence
- `ent_id`: entity identifier (`T`+ a number)
- `ent_text`: the entity text; it can consist of one or more complete words (i.e., not a prefix or a suffix of a word), and will
never include any surrounding punctuation symbols, parentheses, etc.
- `ent_label`: type of entity (`Concept`, `Action`, `Predicate` or `Reference`)
- `start_character`: position of the first character of the entity
- `end_character`: position of the last character of the entity
- `relations`: list of semantic relationships between the entities recognised
- `rel_id`: relation identifier (`R` + a number)
- `rel_label`: type of relation; it can be a general relation (`is-a`, `same-as`, `has-property`, `part-of`, `causes`, `entails`),
a contextual relation (`in-time`, `in-place`, `in-context`), an action role (`subject`, `target`) or a predicate role (`domain`, `arg`).
- `arg1`: ID of the first entity of the relation
- `arg2`: ID of the second entity of the relation
For more information about the types of entities and relations, click [here](https://knowledge-learning.github.io/ehealthkd-2020/tasks).
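To see these fields concretely, the corpus can be loaded with the `datasets` library under the `ehealth_kd` name used in this card. The snippet below is a small sketch that prints the first training sentence and checks that the character offsets slice each entity's text back out of the sentence; the offsets are assumed to be 0-indexed with an exclusive end, which is consistent with the example above:
```python
from datasets import load_dataset

# Load the eHealth-KD corpus (the config name used in this card).
ds = load_dataset("ehealth_kd", split="train")

example = ds[0]
print(example["sentence"])

# Recover each entity span from its character offsets and compare it
# with the stored entity text.
for ent in example["entities"]:
    span = example["sentence"][ent["start_character"]:ent["end_character"]]
    print(ent["ent_id"], ent["ent_label"], span)
    assert span == ent["ent_text"]
```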
### Data Splits
The data is split into a training, validation and test set. The split sizes are as follow:
| | Train | Val | Test |
| ----- | ------ | ----- | ---- |
| eHealth-KD 2020 | 800 | 199 | 100 |
In the challenge there are 4 different scenarios for testing. The test data of this dataset corresponds to the third scenario.
More information about the testing data [here](https://github.com/knowledge-learning/ehealthkd-2020/tree/master/data/testing).
## Dataset Creation
### Curation Rationale
The vast amount of clinical text available online has motivated the development of automatic
knowledge discovery systems that can analyse this data and discover relevant facts.
The eHealth Knowledge Discovery (eHealth-KD) challenge, in its third edition, leverages
a semantic model of human language that encodes the most common expressions of factual
knowledge, via a set of four general-purpose entity types and thirteen semantic relations among
them. The challenge proposes the design of systems that can automatically annotate entities and
relations in clinical text in the Spanish language.
### Source Data
#### Initial Data Collection and Normalization
As in the previous edition, the corpus for eHealth-KD 2020 has been extracted from MedlinePlus sources. This platform
freely provides large health textual data from which we have made a selection for constituting the eHealth-KD corpus.
The selection has been made by sampling specific XML files from the collection available in the [Medline website](https://medlineplus.gov/xml.html).
```
“MedlinePlus is the National Institutes of Health’s Website for patients and their families and
friends. Produced by the National Library of Medicine, the world’s largest medical library, it
brings you information about diseases, conditions, and wellness issues in language you can
understand. MedlinePlus offers reliable, up-to-date health information, anytime, anywhere, for free.”
```
These files contain several entries related to health and medicine topics and have been processed to remove all
XML markup to extract the textual content. Only Spanish language items were considered. Once cleaned, each individual
item was converted to a plain text document, and some further post-processing is applied to remove unwanted sentences,
such as headers, footers and similar elements, and to flatten HTML lists into plain sentences.
#### Who are the source language producers?
As in the previous edition, the corpus for eHealth-KD 2020 was extracted from [MedlinePlus](https://medlineplus.gov/xml.html) sources.
### Annotations
#### Annotation process
Once the MedlinePlus files were cleaned, they were manually tagged using [BRAT](http://brat.nlplab.org/) by a group of
annotators. After tagging, a post-processing was applied to BRAT’s output files (ANN format) to obtain the output files
in the formats needed for the challenge.
#### Who are the annotators?
The data was manually tagged.
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
"The eHealth-KD 2020 proposes –as the previous editions– modeling the human language in a scenario in which Spanish
electronic health documents could be machine-readable from a semantic point of view.
With this task, we expect to encourage the development of software technologies to automatically extract a large variety
of knowledge from eHealth documents written in the Spanish Language."
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
The dataset is provided for research purposes only. Please check the dataset license for additional information.
## Additional Information
### Dataset Curators
#### Organization Committee
| Name | Email | Institution |
|:---------------------------------------:|:---------------------:|:-----------------------------:|
| Yoan Gutiérrez Vázquez (contact person) | ygutierrez@dlsi.ua.es | University of Alicante, Spain |
| Suilan Estévez Velarde | sestevez@matcom.uh.cu | University of Havana, Cuba |
| Alejandro Piad Morffis | apiad@matcom.uh.cu | University of Havana, Cuba |
| Yudivián Almeida Cruz | yudy@matcom.uh.cu | University of Havana, Cuba |
| Andrés Montoyo Guijarro | montoyo@dlsi.ua.es | University of Alicante, Spain |
| Rafael Muñoz Guillena | rafael@dlsi.ua.es | University of Alicante, Spain |
#### Funding
This research has been supported by a Carolina Foundation grant under an agreement between the University of Alicante and the University
of Havana. It has also been partially funded by both universities, the IUII, the Generalitat Valenciana, and the Spanish
Government (Ministerio de Educación, Cultura y Deporte) through the projects SIIA (PROMETEU/2018/089) and
LIVINGLANG (RTI2018-094653-B-C22).
### Licensing Information
This dataset is under the Attribution-NonCommercial-ShareAlike 4.0 International
[(CC BY-NC-SA 4.0)](https://creativecommons.org/licenses/by-nc-sa/4.0/).
To accept the distribution terms, please fill in the following [form](https://forms.gle/pUJutSDq2FYLwNWQA).
### Citation Information
The following link contains the
[preliminary BibTeX entries for the systems' working notes](https://knowledge-learning.github.io/ehealthkd-2020/shared/eHealth-KD_2020_bibtexts.zip).
In addition, to cite the eHealth-KD challenge you can use the following preliminary BibTeX entry:
```
@inproceedings{overview_ehealthkd2020,
author = {Piad{-}Morffis, Alejandro and
Guti{\'{e}}rrez, Yoan and
Ca{\~{n}}izares-Diaz, Hian and
Estevez{-}Velarde, Suilan and
Almeida{-}Cruz, Yudivi{\'{a}}n and
Mu{\~{n}}oz, Rafael and
Montoyo, Andr{\'{e}}s},
title = {Overview of the eHealth Knowledge Discovery Challenge at IberLEF 2020},
  booktitle = {Proceedings of the Iberian Languages Evaluation Forum (IberLEF 2020)},
year = {2020},
}
```
### Contributions
Thanks to [@mariagrandury](https://github.com/mariagrandury) for adding this dataset. |
zolak/twitter_dataset_79_1713178575 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 317814
num_examples: 758
download_size: 160101
dataset_size: 317814
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
tyzhu/synpre_set_1M_token_1000 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 1193269554
num_examples: 1000000
- name: validation
num_bytes: 11956441
num_examples: 10000
download_size: 598117661
dataset_size: 1205225995
---
# Dataset Card for "synpre_set_1M_token_1000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Multimodal-Fatima/VQAv2_sample_validation_facebook_opt_350m_VQAv2_visclues_ns_100 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: prompt
dtype: string
- name: question
dtype: string
- name: true_label
sequence: string
- name: prediction
dtype: string
- name: scores
sequence: float64
splits:
- name: fewshot_0_bs_128
num_bytes: 2548987
num_examples: 100
download_size: 462946
dataset_size: 2548987
---
# Dataset Card for "VQAv2_sample_validation_facebook_opt_350m_VQAv2_visclues_ns_100"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |