---
pretty_name: Evaluation run of teknium/OpenHermes-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [teknium/OpenHermes-7B](https://huggingface.co/teknium/OpenHermes-7B) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_teknium__OpenHermes-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-26T05:03:25.636029](https://huggingface.co/datasets/open-llm-leaderboard/details_teknium__OpenHermes-7B/blob/main/results_2023-10-26T05-03-25.636029.json) (note
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find them in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.2645763422818792,\n\
\ \"em_stderr\": 0.004517352215857921,\n \"f1\": 0.33702810402684713,\n\
\ \"f1_stderr\": 0.004480224621998652,\n \"acc\": 0.3975524975571051,\n\
\ \"acc_stderr\": 0.009127124661977076\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.2645763422818792,\n \"em_stderr\": 0.004517352215857921,\n\
\ \"f1\": 0.33702810402684713,\n \"f1_stderr\": 0.004480224621998652\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.050037907505686124,\n \
\ \"acc_stderr\": 0.006005442354577731\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.745067087608524,\n \"acc_stderr\": 0.012248806969376422\n\
\ }\n}\n```"
repo_url: https://huggingface.co/teknium/OpenHermes-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|arc:challenge|25_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_26T05_03_25.636029
path:
- '**/details_harness|drop|3_2023-10-26T05-03-25.636029.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-26T05-03-25.636029.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_26T05_03_25.636029
path:
- '**/details_harness|gsm8k|5_2023-10-26T05-03-25.636029.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-26T05-03-25.636029.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hellaswag|10_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T14-09-00.502210.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-18T14-09-00.502210.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-18T14-09-00.502210.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_26T05_03_25.636029
path:
- '**/details_harness|winogrande|5_2023-10-26T05-03-25.636029.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-26T05-03-25.636029.parquet'
- config_name: results
data_files:
- split: 2023_09_18T14_09_00.502210
path:
- results_2023-09-18T14-09-00.502210.parquet
- split: 2023_10_26T05_03_25.636029
path:
- results_2023-10-26T05-03-25.636029.parquet
- split: latest
path:
- results_2023-10-26T05-03-25.636029.parquet
---
# Dataset Card for Evaluation run of teknium/OpenHermes-7B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/teknium/OpenHermes-7B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [teknium/OpenHermes-7B](https://huggingface.co/teknium/OpenHermes-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_teknium__OpenHermes-7B",
"harness_winogrande_5",
split="train")
```
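As a note on the naming scheme: each timestamped split name is the run timestamp with `-` and `:` replaced by `_` (the `.` before the microseconds is kept). A minimal, hypothetical helper (not part of the card's tooling) illustrating the mapping:

```python
def split_name(run_timestamp: str) -> str:
    """Map a run timestamp (e.g. "2023-10-26T05:03:25.636029") to the
    corresponding split name used in this dataset's configurations
    (e.g. "2023_10_26T05_03_25.636029")."""
    return run_timestamp.replace("-", "_").replace(":", "_")
```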
## Latest results
These are the [latest results from run 2023-10-26T05:03:25.636029](https://huggingface.co/datasets/open-llm-leaderboard/details_teknium__OpenHermes-7B/blob/main/results_2023-10-26T05-03-25.636029.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.2645763422818792,
"em_stderr": 0.004517352215857921,
"f1": 0.33702810402684713,
"f1_stderr": 0.004480224621998652,
"acc": 0.3975524975571051,
"acc_stderr": 0.009127124661977076
},
"harness|drop|3": {
"em": 0.2645763422818792,
"em_stderr": 0.004517352215857921,
"f1": 0.33702810402684713,
"f1_stderr": 0.004480224621998652
},
"harness|gsm8k|5": {
"acc": 0.050037907505686124,
"acc_stderr": 0.006005442354577731
},
"harness|winogrande|5": {
"acc": 0.745067087608524,
"acc_stderr": 0.012248806969376422
}
}
```
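For quick inspection, the aggregated metrics above can be handled as plain JSON; a small sketch (the dict literal is copied, abridged to the accuracy entries, from the results block above) that picks out the individual task with the highest accuracy:

```python
import json

# Aggregated metrics copied (abridged) from the "Latest results" block above.
latest = json.loads("""
{
  "all": {"acc": 0.3975524975571051, "acc_stderr": 0.009127124661977076},
  "harness|gsm8k|5": {"acc": 0.050037907505686124, "acc_stderr": 0.006005442354577731},
  "harness|winogrande|5": {"acc": 0.745067087608524, "acc_stderr": 0.012248806969376422}
}
""")

# Find the individual task with the highest accuracy, skipping the "all" aggregate.
tasks = {k: v for k, v in latest.items() if k != "all" and "acc" in v}
best_task = max(tasks, key=lambda k: tasks[k]["acc"])
```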
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
---
pretty_name: Evaluation run of julianweng/Llama-2-7b-chat-orcah
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [julianweng/Llama-2-7b-chat-orcah](https://huggingface.co/julianweng/Llama-2-7b-chat-orcah)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_julianweng__Llama-2-7b-chat-orcah\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-17T17:33:03.536328](https://huggingface.co/datasets/open-llm-leaderboard/details_julianweng__Llama-2-7b-chat-orcah/blob/main/results_2023-09-17T17-33-03.536328.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.02936241610738255,\n\
\ \"em_stderr\": 0.0017288770032803159,\n \"f1\": 0.07552432885906037,\n\
\ \"f1_stderr\": 0.0020587215501161925,\n \"acc\": 0.3737288120380116,\n\
\ \"acc_stderr\": 0.00900957367793152\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.02936241610738255,\n \"em_stderr\": 0.0017288770032803159,\n\
\ \"f1\": 0.07552432885906037,\n \"f1_stderr\": 0.0020587215501161925\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.03790750568612585,\n \
\ \"acc_stderr\": 0.005260333907798431\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7095501183898973,\n \"acc_stderr\": 0.01275881344806461\n\
\ }\n}\n```"
repo_url: https://huggingface.co/julianweng/Llama-2-7b-chat-orcah
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_24T11_44_40.236710
path:
- '**/details_harness|arc:challenge|25_2023-07-24T11:44:40.236710.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-24T11:44:40.236710.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_17T17_33_03.536328
path:
- '**/details_harness|drop|3_2023-09-17T17-33-03.536328.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-17T17-33-03.536328.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_17T17_33_03.536328
path:
- '**/details_harness|gsm8k|5_2023-09-17T17-33-03.536328.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-17T17-33-03.536328.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_24T11_44_40.236710
path:
- '**/details_harness|hellaswag|10_2023-07-24T11:44:40.236710.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-24T11:44:40.236710.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_24T11_44_40.236710
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T11:44:40.236710.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T11:44:40.236710.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T11:44:40.236710.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_24T11_44_40.236710
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T11:44:40.236710.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T11:44:40.236710.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_24T11_44_40.236710
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T11:44:40.236710.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T11:44:40.236710.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_24T11_44_40.236710
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T11:44:40.236710.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T11:44:40.236710.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_24T11_44_40.236710
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T11:44:40.236710.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T11:44:40.236710.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_24T11_44_40.236710
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T11:44:40.236710.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T11:44:40.236710.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_24T11_44_40.236710
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T11:44:40.236710.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T11:44:40.236710.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_24T11_44_40.236710
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T11:44:40.236710.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T11:44:40.236710.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_24T11_44_40.236710
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T11:44:40.236710.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T11:44:40.236710.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_24T11_44_40.236710
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T11:44:40.236710.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T11:44:40.236710.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_24T11_44_40.236710
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T11:44:40.236710.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T11:44:40.236710.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_24T11_44_40.236710
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T11:44:40.236710.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T11:44:40.236710.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_24T11_44_40.236710
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T11:44:40.236710.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T11:44:40.236710.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_24T11_44_40.236710
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T11:44:40.236710.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T11:44:40.236710.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_24T11_44_40.236710
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T11:44:40.236710.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T11:44:40.236710.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_24T11_44_40.236710
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T11:44:40.236710.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T11:44:40.236710.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_24T11_44_40.236710
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T11:44:40.236710.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T11:44:40.236710.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_24T11_44_40.236710
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T11:44:40.236710.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T11:44:40.236710.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_24T11_44_40.236710
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T11:44:40.236710.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T11:44:40.236710.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_24T11_44_40.236710
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T11:44:40.236710.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T11:44:40.236710.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_24T11_44_40.236710
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T11:44:40.236710.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T11:44:40.236710.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_24T11_44_40.236710
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T11:44:40.236710.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T11:44:40.236710.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_24T11_44_40.236710
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T11:44:40.236710.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T11:44:40.236710.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_24T11_44_40.236710
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T11:44:40.236710.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T11:44:40.236710.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_24T11_44_40.236710
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T11:44:40.236710.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T11:44:40.236710.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_24T11_44_40.236710
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T11:44:40.236710.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T11:44:40.236710.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_24T11_44_40.236710
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T11:44:40.236710.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T11:44:40.236710.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_24T11_44_40.236710
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T11:44:40.236710.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T11:44:40.236710.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_24T11_44_40.236710
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T11:44:40.236710.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T11:44:40.236710.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_24T11_44_40.236710
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T11:44:40.236710.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T11:44:40.236710.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_24T11_44_40.236710
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T11:44:40.236710.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T11:44:40.236710.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_24T11_44_40.236710
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T11:44:40.236710.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T11:44:40.236710.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_24T11_44_40.236710
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T11:44:40.236710.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T11:44:40.236710.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_24T11_44_40.236710
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T11:44:40.236710.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T11:44:40.236710.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_24T11_44_40.236710
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T11:44:40.236710.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T11:44:40.236710.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_24T11_44_40.236710
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T11:44:40.236710.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T11:44:40.236710.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_24T11_44_40.236710
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T11:44:40.236710.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T11:44:40.236710.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_24T11_44_40.236710
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T11:44:40.236710.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T11:44:40.236710.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_24T11_44_40.236710
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T11:44:40.236710.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T11:44:40.236710.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_24T11_44_40.236710
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-24T11:44:40.236710.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-24T11:44:40.236710.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_24T11_44_40.236710
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T11:44:40.236710.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T11:44:40.236710.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_24T11_44_40.236710
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T11:44:40.236710.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T11:44:40.236710.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_24T11_44_40.236710
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T11:44:40.236710.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T11:44:40.236710.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_24T11_44_40.236710
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T11:44:40.236710.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T11:44:40.236710.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_24T11_44_40.236710
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T11:44:40.236710.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T11:44:40.236710.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_24T11_44_40.236710
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T11:44:40.236710.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T11:44:40.236710.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_24T11_44_40.236710
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T11:44:40.236710.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T11:44:40.236710.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_24T11_44_40.236710
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T11:44:40.236710.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T11:44:40.236710.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_24T11_44_40.236710
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T11:44:40.236710.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T11:44:40.236710.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_24T11_44_40.236710
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T11:44:40.236710.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T11:44:40.236710.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_24T11_44_40.236710
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T11:44:40.236710.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T11:44:40.236710.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_24T11_44_40.236710
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T11:44:40.236710.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T11:44:40.236710.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_24T11_44_40.236710
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T11:44:40.236710.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T11:44:40.236710.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_24T11_44_40.236710
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T11:44:40.236710.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T11:44:40.236710.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_24T11_44_40.236710
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T11:44:40.236710.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T11:44:40.236710.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_24T11_44_40.236710
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T11:44:40.236710.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T11:44:40.236710.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_24T11_44_40.236710
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T11:44:40.236710.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T11:44:40.236710.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_24T11_44_40.236710
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T11:44:40.236710.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T11:44:40.236710.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_24T11_44_40.236710
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-24T11:44:40.236710.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-24T11:44:40.236710.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_17T17_33_03.536328
path:
- '**/details_harness|winogrande|5_2023-09-17T17-33-03.536328.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-17T17-33-03.536328.parquet'
- config_name: results
data_files:
- split: 2023_07_24T11_44_40.236710
path:
- results_2023-07-24T11:44:40.236710.parquet
- split: 2023_09_17T17_33_03.536328
path:
- results_2023-09-17T17-33-03.536328.parquet
- split: latest
path:
- results_2023-09-17T17-33-03.536328.parquet
---
# Dataset Card for Evaluation run of julianweng/Llama-2-7b-chat-orcah
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/julianweng/Llama-2-7b-chat-orcah
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [julianweng/Llama-2-7b-chat-orcah](https://huggingface.co/julianweng/Llama-2-7b-chat-orcah) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_julianweng__Llama-2-7b-chat-orcah",
"harness_winogrande_5",
split="train")
```
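As the `configs` section above shows, each run's split name is derived from its timestamp by replacing the `-` and `:` separators with underscores (compare run `2023-09-17T17:33:03.536328` with split `2023_09_17T17_33_03.536328`). A minimal sketch of that mapping, using a hypothetical helper name:

```python
def run_timestamp_to_split(ts: str) -> str:
    """Map a run timestamp (as it appears in result filenames) to its split name.

    Hypothetical helper, not part of the datasets library: the convention is
    inferred from the split names listed in this card's YAML configs.
    """
    return ts.replace("-", "_").replace(":", "_")

print(run_timestamp_to_split("2023-09-17T17:33:03.536328"))
# → 2023_09_17T17_33_03.536328
```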
## Latest results
These are the [latest results from run 2023-09-17T17:33:03.536328](https://huggingface.co/datasets/open-llm-leaderboard/details_julianweng__Llama-2-7b-chat-orcah/blob/main/results_2023-09-17T17-33-03.536328.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.02936241610738255,
"em_stderr": 0.0017288770032803159,
"f1": 0.07552432885906037,
"f1_stderr": 0.0020587215501161925,
"acc": 0.3737288120380116,
"acc_stderr": 0.00900957367793152
},
"harness|drop|3": {
"em": 0.02936241610738255,
"em_stderr": 0.0017288770032803159,
"f1": 0.07552432885906037,
"f1_stderr": 0.0020587215501161925
},
"harness|gsm8k|5": {
"acc": 0.03790750568612585,
"acc_stderr": 0.005260333907798431
},
"harness|winogrande|5": {
"acc": 0.7095501183898973,
"acc_stderr": 0.01275881344806461
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
vietgpt/open-web-math | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: url
dtype: string
- name: text
dtype: string
- name: date
dtype: string
- name: metadata
dtype: string
splits:
- name: train
num_bytes: 56651995057
num_examples: 6315233
download_size: 27428876767
dataset_size: 56651995057
---
# Dataset Card for "open-web-math"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Alexator26/1200_second_face_stickers_cleared | ---
dataset_info:
features:
- name: original_image
dtype: image
- name: edit_prompt
dtype: string
- name: cartoonized_image
dtype: image
splits:
- name: train
num_bytes: 115526041.0
num_examples: 186
download_size: 115527421
dataset_size: 115526041.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Vernei/cargos | ---
license: afl-3.0
---
|
open-llm-leaderboard/details_xxyyy123__Mistral-dpo-v1 | ---
pretty_name: Evaluation run of xxyyy123/Mistral-dpo-v1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [xxyyy123/Mistral-dpo-v1](https://huggingface.co/xxyyy123/Mistral-dpo-v1) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_xxyyy123__Mistral-dpo-v1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-09T15:21:55.337757](https://huggingface.co/datasets/open-llm-leaderboard/details_xxyyy123__Mistral-dpo-v1/blob/main/results_2023-12-09T15-21-55.337757.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6327450857688259,\n\
\ \"acc_stderr\": 0.03239847501378947,\n \"acc_norm\": 0.6369776561937077,\n\
\ \"acc_norm_stderr\": 0.03304786152616907,\n \"mc1\": 0.3525091799265606,\n\
\ \"mc1_stderr\": 0.016724646380756547,\n \"mc2\": 0.50494275538215,\n\
\ \"mc2_stderr\": 0.015065297117078024\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5972696245733788,\n \"acc_stderr\": 0.014332236306790147,\n\
\ \"acc_norm\": 0.6348122866894198,\n \"acc_norm_stderr\": 0.014070265519268802\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6350328619796853,\n\
\ \"acc_stderr\": 0.00480437056385622,\n \"acc_norm\": 0.8358892650866361,\n\
\ \"acc_norm_stderr\": 0.0036961908325474184\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.04605661864718381,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.04605661864718381\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n\
\ \"acc_stderr\": 0.04188307537595852,\n \"acc_norm\": 0.6222222222222222,\n\
\ \"acc_norm_stderr\": 0.04188307537595852\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6710526315789473,\n \"acc_stderr\": 0.03823428969926604,\n\
\ \"acc_norm\": 0.6710526315789473,\n \"acc_norm_stderr\": 0.03823428969926604\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6867924528301886,\n \"acc_stderr\": 0.028544793319055326,\n\
\ \"acc_norm\": 0.6867924528301886,\n \"acc_norm_stderr\": 0.028544793319055326\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.037455547914624555,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.037455547914624555\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n\
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411019,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411019\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n\
\ \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n\
\ \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.04724007352383886,\n\
\ \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.04724007352383886\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n\
\ \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.04697085136647863,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.04697085136647863\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.04164188720169377,\n\
\ \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.04164188720169377\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3915343915343915,\n \"acc_stderr\": 0.025138091388851112,\n \"\
acc_norm\": 0.3915343915343915,\n \"acc_norm_stderr\": 0.025138091388851112\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7548387096774194,\n \"acc_stderr\": 0.024472243840895518,\n \"\
acc_norm\": 0.7548387096774194,\n \"acc_norm_stderr\": 0.024472243840895518\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"\
acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009182,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009182\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7878787878787878,\n \"acc_stderr\": 0.02912652283458682,\n \"\
acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.02912652283458682\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.02381447708659355,\n\
\ \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.02381447708659355\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6743589743589744,\n \"acc_stderr\": 0.02375966576741229,\n \
\ \"acc_norm\": 0.6743589743589744,\n \"acc_norm_stderr\": 0.02375966576741229\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3296296296296296,\n \"acc_stderr\": 0.028661201116524575,\n \
\ \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.028661201116524575\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.03038835355188679,\n \
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.03038835355188679\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.0395802723112157,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.0395802723112157\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8256880733944955,\n \"acc_stderr\": 0.016265675632010354,\n \"\
acc_norm\": 0.8256880733944955,\n \"acc_norm_stderr\": 0.016265675632010354\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5185185185185185,\n \"acc_stderr\": 0.03407632093854051,\n \"\
acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.03407632093854051\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8186274509803921,\n \"acc_stderr\": 0.027044621719474086,\n \"\
acc_norm\": 0.8186274509803921,\n \"acc_norm_stderr\": 0.027044621719474086\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7890295358649789,\n \"acc_stderr\": 0.02655837250266192,\n \
\ \"acc_norm\": 0.7890295358649789,\n \"acc_norm_stderr\": 0.02655837250266192\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n\
\ \"acc_stderr\": 0.03160295143776679,\n \"acc_norm\": 0.6681614349775785,\n\
\ \"acc_norm_stderr\": 0.03160295143776679\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.031921934489347235,\n\
\ \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.031921934489347235\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.04058042015646034,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.04058042015646034\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n\
\ \"acc_stderr\": 0.022801382534597518,\n \"acc_norm\": 0.8589743589743589,\n\
\ \"acc_norm_stderr\": 0.022801382534597518\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8263090676883781,\n\
\ \"acc_stderr\": 0.01354741565866226,\n \"acc_norm\": 0.8263090676883781,\n\
\ \"acc_norm_stderr\": 0.01354741565866226\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.708092485549133,\n \"acc_stderr\": 0.02447699407624734,\n\
\ \"acc_norm\": 0.708092485549133,\n \"acc_norm_stderr\": 0.02447699407624734\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3307262569832402,\n\
\ \"acc_stderr\": 0.01573502625896612,\n \"acc_norm\": 0.3307262569832402,\n\
\ \"acc_norm_stderr\": 0.01573502625896612\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.738562091503268,\n \"acc_stderr\": 0.025160998214292456,\n\
\ \"acc_norm\": 0.738562091503268,\n \"acc_norm_stderr\": 0.025160998214292456\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n\
\ \"acc_stderr\": 0.02567025924218893,\n \"acc_norm\": 0.7138263665594855,\n\
\ \"acc_norm_stderr\": 0.02567025924218893\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7253086419753086,\n \"acc_stderr\": 0.024836057868294677,\n\
\ \"acc_norm\": 0.7253086419753086,\n \"acc_norm_stderr\": 0.024836057868294677\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.45390070921985815,\n \"acc_stderr\": 0.02970045324729147,\n \
\ \"acc_norm\": 0.45390070921985815,\n \"acc_norm_stderr\": 0.02970045324729147\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4302477183833116,\n\
\ \"acc_stderr\": 0.012645361435115233,\n \"acc_norm\": 0.4302477183833116,\n\
\ \"acc_norm_stderr\": 0.012645361435115233\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6617647058823529,\n \"acc_stderr\": 0.02873932851398357,\n\
\ \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.02873932851398357\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6781045751633987,\n \"acc_stderr\": 0.018901015322093092,\n \
\ \"acc_norm\": 0.6781045751633987,\n \"acc_norm_stderr\": 0.018901015322093092\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.028795185574291293,\n\
\ \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.028795185574291293\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n\
\ \"acc_stderr\": 0.03891364495835817,\n \"acc_norm\": 0.5120481927710844,\n\
\ \"acc_norm_stderr\": 0.03891364495835817\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.030611116557432528,\n\
\ \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.030611116557432528\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3525091799265606,\n\
\ \"mc1_stderr\": 0.016724646380756547,\n \"mc2\": 0.50494275538215,\n\
\ \"mc2_stderr\": 0.015065297117078024\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7932123125493291,\n \"acc_stderr\": 0.011382566829235802\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4609552691432904,\n \
\ \"acc_stderr\": 0.013730428449116337\n }\n}\n```"
repo_url: https://huggingface.co/xxyyy123/Mistral-dpo-v1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_09T15_21_55.337757
path:
- '**/details_harness|arc:challenge|25_2023-12-09T15-21-55.337757.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-09T15-21-55.337757.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_09T15_21_55.337757
path:
- '**/details_harness|gsm8k|5_2023-12-09T15-21-55.337757.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-09T15-21-55.337757.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_09T15_21_55.337757
path:
- '**/details_harness|hellaswag|10_2023-12-09T15-21-55.337757.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-09T15-21-55.337757.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_09T15_21_55.337757
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T15-21-55.337757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T15-21-55.337757.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T15-21-55.337757.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_09T15_21_55.337757
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T15-21-55.337757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T15-21-55.337757.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_09T15_21_55.337757
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T15-21-55.337757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T15-21-55.337757.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_09T15_21_55.337757
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T15-21-55.337757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T15-21-55.337757.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_09T15_21_55.337757
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T15-21-55.337757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T15-21-55.337757.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_09T15_21_55.337757
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T15-21-55.337757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T15-21-55.337757.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_09T15_21_55.337757
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T15-21-55.337757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T15-21-55.337757.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_09T15_21_55.337757
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T15-21-55.337757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T15-21-55.337757.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_09T15_21_55.337757
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T15-21-55.337757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T15-21-55.337757.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_09T15_21_55.337757
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T15-21-55.337757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T15-21-55.337757.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_09T15_21_55.337757
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T15-21-55.337757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T15-21-55.337757.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_09T15_21_55.337757
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T15-21-55.337757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T15-21-55.337757.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_09T15_21_55.337757
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T15-21-55.337757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T15-21-55.337757.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_09T15_21_55.337757
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T15-21-55.337757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T15-21-55.337757.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_09T15_21_55.337757
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T15-21-55.337757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T15-21-55.337757.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_09T15_21_55.337757
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T15-21-55.337757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T15-21-55.337757.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_09T15_21_55.337757
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T15-21-55.337757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T15-21-55.337757.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_09T15_21_55.337757
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T15-21-55.337757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T15-21-55.337757.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_09T15_21_55.337757
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T15-21-55.337757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T15-21-55.337757.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_09T15_21_55.337757
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T15-21-55.337757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T15-21-55.337757.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_09T15_21_55.337757
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T15-21-55.337757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T15-21-55.337757.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_09T15_21_55.337757
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T15-21-55.337757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T15-21-55.337757.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_09T15_21_55.337757
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T15-21-55.337757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T15-21-55.337757.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_09T15_21_55.337757
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T15-21-55.337757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T15-21-55.337757.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_09T15_21_55.337757
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T15-21-55.337757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T15-21-55.337757.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_09T15_21_55.337757
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T15-21-55.337757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T15-21-55.337757.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_09T15_21_55.337757
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T15-21-55.337757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T15-21-55.337757.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_09T15_21_55.337757
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T15-21-55.337757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T15-21-55.337757.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_09T15_21_55.337757
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T15-21-55.337757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T15-21-55.337757.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_09T15_21_55.337757
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T15-21-55.337757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T15-21-55.337757.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_09T15_21_55.337757
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T15-21-55.337757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T15-21-55.337757.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_09T15_21_55.337757
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T15-21-55.337757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T15-21-55.337757.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_09T15_21_55.337757
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T15-21-55.337757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T15-21-55.337757.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_09T15_21_55.337757
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T15-21-55.337757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T15-21-55.337757.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_09T15_21_55.337757
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T15-21-55.337757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T15-21-55.337757.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_09T15_21_55.337757
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T15-21-55.337757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T15-21-55.337757.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_09T15_21_55.337757
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T15-21-55.337757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T15-21-55.337757.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_09T15_21_55.337757
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T15-21-55.337757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T15-21-55.337757.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_09T15_21_55.337757
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T15-21-55.337757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T15-21-55.337757.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_09T15_21_55.337757
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-09T15-21-55.337757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-09T15-21-55.337757.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_09T15_21_55.337757
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T15-21-55.337757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T15-21-55.337757.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_09T15_21_55.337757
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T15-21-55.337757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T15-21-55.337757.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_09T15_21_55.337757
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T15-21-55.337757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T15-21-55.337757.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_09T15_21_55.337757
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T15-21-55.337757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T15-21-55.337757.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_09T15_21_55.337757
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T15-21-55.337757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T15-21-55.337757.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_09T15_21_55.337757
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T15-21-55.337757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T15-21-55.337757.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_09T15_21_55.337757
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T15-21-55.337757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T15-21-55.337757.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_09T15_21_55.337757
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T15-21-55.337757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T15-21-55.337757.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_09T15_21_55.337757
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T15-21-55.337757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T15-21-55.337757.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_09T15_21_55.337757
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T15-21-55.337757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T15-21-55.337757.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_09T15_21_55.337757
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T15-21-55.337757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T15-21-55.337757.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_09T15_21_55.337757
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T15-21-55.337757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T15-21-55.337757.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_09T15_21_55.337757
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T15-21-55.337757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T15-21-55.337757.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_09T15_21_55.337757
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T15-21-55.337757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T15-21-55.337757.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_09T15_21_55.337757
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T15-21-55.337757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T15-21-55.337757.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_09T15_21_55.337757
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T15-21-55.337757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T15-21-55.337757.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_09T15_21_55.337757
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T15-21-55.337757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T15-21-55.337757.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_09T15_21_55.337757
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T15-21-55.337757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T15-21-55.337757.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_09T15_21_55.337757
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-09T15-21-55.337757.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-09T15-21-55.337757.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_09T15_21_55.337757
path:
- '**/details_harness|winogrande|5_2023-12-09T15-21-55.337757.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-09T15-21-55.337757.parquet'
- config_name: results
data_files:
- split: 2023_12_09T15_21_55.337757
path:
- results_2023-12-09T15-21-55.337757.parquet
- split: latest
path:
- results_2023-12-09T15-21-55.337757.parquet
---
# Dataset Card for Evaluation run of xxyyy123/Mistral-dpo-v1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/xxyyy123/Mistral-dpo-v1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [xxyyy123/Mistral-dpo-v1](https://huggingface.co/xxyyy123/Mistral-dpo-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_xxyyy123__Mistral-dpo-v1",
"harness_winogrande_5",
split="train")
```
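The per-task configuration names listed in the YAML above follow a regular pattern (`harness_hendrycksTest_<task>_<n_shot>` for the MMLU sub-tasks). If you want to iterate over several sub-tasks programmatically, a small helper like the one below (hypothetical, not part of this card) can build the config names to pass to `load_dataset`:

```python
def mmlu_config(task: str, n_shot: int = 5) -> str:
    """Build the leaderboard config name for an MMLU (hendrycksTest) sub-task."""
    return f"harness_hendrycksTest_{task}_{n_shot}"

# Example: the config name for the abstract_algebra sub-task, 5-shot
print(mmlu_config("abstract_algebra"))
# harness_hendrycksTest_abstract_algebra_5
```

You could then load a given sub-task's details with, for instance, `load_dataset("open-llm-leaderboard/details_xxyyy123__Mistral-dpo-v1", mmlu_config("anatomy"), split="latest")`.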
## Latest results
These are the [latest results from run 2023-12-09T15:21:55.337757](https://huggingface.co/datasets/open-llm-leaderboard/details_xxyyy123__Mistral-dpo-v1/blob/main/results_2023-12-09T15-21-55.337757.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6327450857688259,
"acc_stderr": 0.03239847501378947,
"acc_norm": 0.6369776561937077,
"acc_norm_stderr": 0.03304786152616907,
"mc1": 0.3525091799265606,
"mc1_stderr": 0.016724646380756547,
"mc2": 0.50494275538215,
"mc2_stderr": 0.015065297117078024
},
"harness|arc:challenge|25": {
"acc": 0.5972696245733788,
"acc_stderr": 0.014332236306790147,
"acc_norm": 0.6348122866894198,
"acc_norm_stderr": 0.014070265519268802
},
"harness|hellaswag|10": {
"acc": 0.6350328619796853,
"acc_stderr": 0.00480437056385622,
"acc_norm": 0.8358892650866361,
"acc_norm_stderr": 0.0036961908325474184
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.04605661864718381,
"acc_norm": 0.3,
"acc_norm_stderr": 0.04605661864718381
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595852,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595852
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6710526315789473,
"acc_stderr": 0.03823428969926604,
"acc_norm": 0.6710526315789473,
"acc_norm_stderr": 0.03823428969926604
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6867924528301886,
"acc_stderr": 0.028544793319055326,
"acc_norm": 0.6867924528301886,
"acc_norm_stderr": 0.028544793319055326
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.037455547914624555,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.037455547914624555
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.03614665424180826,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.03614665424180826
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.04724007352383886,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.04724007352383886
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.04697085136647863,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.04697085136647863
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.04164188720169377,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.04164188720169377
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3915343915343915,
"acc_stderr": 0.025138091388851112,
"acc_norm": 0.3915343915343915,
"acc_norm_stderr": 0.025138091388851112
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04426266681379909,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04426266681379909
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7548387096774194,
"acc_stderr": 0.024472243840895518,
"acc_norm": 0.7548387096774194,
"acc_norm_stderr": 0.024472243840895518
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009182,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009182
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.02912652283458682,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.02912652283458682
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.02381447708659355,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.02381447708659355
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6743589743589744,
"acc_stderr": 0.02375966576741229,
"acc_norm": 0.6743589743589744,
"acc_norm_stderr": 0.02375966576741229
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.028661201116524575,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.028661201116524575
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.03038835355188679,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.03038835355188679
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.0395802723112157,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.0395802723112157
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8256880733944955,
"acc_stderr": 0.016265675632010354,
"acc_norm": 0.8256880733944955,
"acc_norm_stderr": 0.016265675632010354
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.03407632093854051,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.03407632093854051
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8186274509803921,
"acc_stderr": 0.027044621719474086,
"acc_norm": 0.8186274509803921,
"acc_norm_stderr": 0.027044621719474086
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7890295358649789,
"acc_stderr": 0.02655837250266192,
"acc_norm": 0.7890295358649789,
"acc_norm_stderr": 0.02655837250266192
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6681614349775785,
"acc_stderr": 0.03160295143776679,
"acc_norm": 0.6681614349775785,
"acc_norm_stderr": 0.03160295143776679
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7914110429447853,
"acc_stderr": 0.031921934489347235,
"acc_norm": 0.7914110429447853,
"acc_norm_stderr": 0.031921934489347235
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.04058042015646034,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.04058042015646034
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.022801382534597518,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.022801382534597518
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8263090676883781,
"acc_stderr": 0.01354741565866226,
"acc_norm": 0.8263090676883781,
"acc_norm_stderr": 0.01354741565866226
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.708092485549133,
"acc_stderr": 0.02447699407624734,
"acc_norm": 0.708092485549133,
"acc_norm_stderr": 0.02447699407624734
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3307262569832402,
"acc_stderr": 0.01573502625896612,
"acc_norm": 0.3307262569832402,
"acc_norm_stderr": 0.01573502625896612
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.738562091503268,
"acc_stderr": 0.025160998214292456,
"acc_norm": 0.738562091503268,
"acc_norm_stderr": 0.025160998214292456
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.02567025924218893,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.02567025924218893
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7253086419753086,
"acc_stderr": 0.024836057868294677,
"acc_norm": 0.7253086419753086,
"acc_norm_stderr": 0.024836057868294677
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.45390070921985815,
"acc_stderr": 0.02970045324729147,
"acc_norm": 0.45390070921985815,
"acc_norm_stderr": 0.02970045324729147
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4302477183833116,
"acc_stderr": 0.012645361435115233,
"acc_norm": 0.4302477183833116,
"acc_norm_stderr": 0.012645361435115233
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.02873932851398357,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.02873932851398357
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6781045751633987,
"acc_stderr": 0.018901015322093092,
"acc_norm": 0.6781045751633987,
"acc_norm_stderr": 0.018901015322093092
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7183673469387755,
"acc_stderr": 0.028795185574291293,
"acc_norm": 0.7183673469387755,
"acc_norm_stderr": 0.028795185574291293
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5120481927710844,
"acc_stderr": 0.03891364495835817,
"acc_norm": 0.5120481927710844,
"acc_norm_stderr": 0.03891364495835817
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8011695906432749,
"acc_stderr": 0.030611116557432528,
"acc_norm": 0.8011695906432749,
"acc_norm_stderr": 0.030611116557432528
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3525091799265606,
"mc1_stderr": 0.016724646380756547,
"mc2": 0.50494275538215,
"mc2_stderr": 0.015065297117078024
},
"harness|winogrande|5": {
"acc": 0.7932123125493291,
"acc_stderr": 0.011382566829235802
},
"harness|gsm8k|5": {
"acc": 0.4609552691432904,
"acc_stderr": 0.013730428449116337
}
}
```
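Each `harness|...` entry above follows the same shape: a metric plus a matching `<metric>_stderr` key. The per-task scores can therefore be pulled out with plain `json`. A minimal sketch using two values copied from the results block above (note the `"all"` block aggregates over more tasks than the two shown here):

```python
import json

# Two per-task entries copied from the results block above; each task maps
# metric names to values, with a matching "<metric>_stderr" key.
results = json.loads("""
{
  "harness|winogrande|5": {"acc": 0.7932123125493291, "acc_stderr": 0.011382566829235802},
  "harness|gsm8k|5": {"acc": 0.4609552691432904, "acc_stderr": 0.013730428449116337}
}
""")

# Collect the "acc" metric across tasks and average it.
accs = {task: scores["acc"] for task, scores in results.items()}
mean_acc = sum(accs.values()) / len(accs)
print(f"mean acc over {len(accs)} tasks: {mean_acc:.4f}")
```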
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
satwikapaul/retinaldisease | ---
license: unknown
---
|
erfanloghmani/myket-android-application-recommendation-dataset | ---
license: mit
task_categories:
- graph-ml
size_categories:
- 100K<n<1M
configs:
- config_name: main_data
data_files: "myket.csv"
- config_name: package_name_features
data_files: "app_info.csv"
---
# Myket Android Application Install Dataset
This dataset contains information on application install interactions of users in the [Myket](https://myket.ir/) Android application market. The dataset was created to evaluate interaction prediction models, which require user and item identifiers along with timestamps of the interactions.
## Data Creation
The dataset was initially generated by the Myket data team, and later cleaned and subsampled by Erfan Loghmani, a master's student at Sharif University of Technology at the time. The data team focused on a two-week period and randomly sampled 1/3 of the users with interactions during that period. They then selected install and update interactions for three months before and after the two-week period, resulting in interactions spanning about 6 months and two weeks.
We further subsampled and cleaned the data to focus on application download interactions. We identified the top 8000 most installed applications and selected interactions related to them. We retained users with more than 32 interactions, resulting in 280,391 users. From this group, we randomly selected 10,000 users, and the data was filtered to include only interactions for these users. The detailed procedure can be found [here](https://github.com/erfanloghmani/myket-android-application-market-dataset/blob/main/create_data.ipynb).
## Data Structure
The dataset has two main files.
- `myket.csv`: This file contains the interaction information and follows the same format as the datasets used in the "[JODIE: Predicting Dynamic Embedding Trajectory in Temporal Interaction Networks](https://github.com/claws-lab/jodie)" (ACM SIGKDD 2019) project. However, this dataset does not contain state labels or interaction features, so the associated columns are all zero.
- `app_info_sample.csv`: This file comprises features associated with applications present in the sample. For each individual application, information such as the approximate number of installs, average rating, count of ratings, and category are included. These features provide insights into the applications present in the dataset.
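Since the interaction file follows the JODIE CSV layout, it can be read with the standard library alone. A minimal sketch on a toy in-memory sample; the column names here are illustrative assumptions, so check the header of `myket.csv` itself:

```python
import csv
import io
from collections import Counter

# A toy sample in the JODIE-style layout described above. The column names
# are assumptions for illustration -- verify against the real file header.
sample = io.StringIO(
    "user_id,item_id,timestamp,state_label,feature\n"
    "0,com.instagram.android,1.0,0,0\n"
    "0,com.whatsapp,5.0,0,0\n"
    "1,com.instagram.android,2.0,0,0\n"
)
rows = list(csv.DictReader(sample))

# State labels (and interaction features) are all zero in this dataset.
assert all(r["state_label"] == "0" for r in rows)

# Count interactions per user.
per_user = Counter(r["user_id"] for r in rows)
print(per_user)  # -> Counter({'0': 2, '1': 1})
```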
## Dataset Details
- Total Instances: 694,121 install interaction instances
- Instances Format: Triplets of user_id, app_name, timestamp
- 10,000 users and 7,988 android applications
For a detailed summary of the data's statistics, including information on users, applications, and interactions, please refer to the Python notebook available at [summary-stats.ipynb](https://github.com/erfanloghmani/myket-android-application-market-dataset/blob/main/summary-stats.ipynb). The notebook provides an overview of the dataset's characteristics and can be helpful for understanding the data's structure before using it for research or analysis.
### Top 20 Most Installed Applications
| Package Name | Count of Interactions |
| ---------------------------------- | --------------------- |
| com.instagram.android | 15292 |
| ir.resaneh1.iptv | 12143 |
| com.tencent.ig | 7919 |
| com.ForgeGames.SpecialForcesGroup2 | 7797 |
| ir.nomogame.ClutchGame | 6193 |
| com.dts.freefireth | 6041 |
| com.whatsapp | 5876 |
| com.supercell.clashofclans | 5817 |
| com.mojang.minecraftpe | 5649 |
| com.lenovo.anyshare.gps | 5076 |
| ir.medu.shad | 4673 |
| com.firsttouchgames.dls3 | 4641 |
| com.activision.callofduty.shooter | 4357 |
| com.tencent.iglite | 4126 |
| com.aparat | 3598 |
| com.kiloo.subwaysurf | 3135 |
| com.supercell.clashroyale | 2793 |
| co.palang.QuizOfKings | 2589 |
| com.nazdika.app | 2436 |
| com.digikala | 2413 |
## Comparison with SNAP Datasets
The Myket dataset introduced in this repository exhibits distinct characteristics compared to the real-world datasets used by the JODIE project. The table below provides a comparative overview of the key dataset characteristics:
| Dataset | #Users | #Items | #Interactions | Average Interactions per User | Average Unique Items per User |
| --------- | ----------------- | ----------------- | ----------------- | ----------------------------- | ----------------------------- |
| **Myket** | **10,000** | **7,988** | 694,121 | 69.4 | 54.6 |
| LastFM | 980 | 1,000 | 1,293,103 | 1,319.5 | 158.2 |
| Reddit | **10,000** | 984 | 672,447 | 67.2 | 7.9 |
| Wikipedia | 8,227 | 1,000 | 157,474 | 19.1 | 2.2 |
| MOOC | 7,047 | 97 | 411,749 | 58.4 | 25.3 |
The Myket dataset stands out by having an ample number of both users and items, highlighting its relevance for real-world, large-scale applications. Unlike the LastFM, Reddit, and Wikipedia datasets, where users repeatedly interact with the same items, the Myket dataset contains comparatively few repetitive interactions. This unique characteristic reflects the diverse nature of user behaviors in the Android application market environment.
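The two averaged columns in the table are simple ratios over the interaction triplets. A sketch on toy data in the (user_id, app_name, timestamp) format described under "Dataset Details":

```python
from collections import defaultdict

# Toy interaction triplets (user_id, app_name, timestamp); values are
# illustrative only, not taken from the actual dataset.
interactions = [
    ("u1", "com.whatsapp", 1.0),
    ("u1", "com.whatsapp", 2.0),
    ("u1", "com.aparat", 3.0),
    ("u2", "com.digikala", 4.0),
]

items_per_user = defaultdict(list)
for user, app, _ts in interactions:
    items_per_user[user].append(app)

n_users = len(items_per_user)
# "Average Interactions per User": all interactions / number of users.
avg_interactions = len(interactions) / n_users
# "Average Unique Items per User": distinct apps per user, averaged.
avg_unique = sum(len(set(apps)) for apps in items_per_user.values()) / n_users
print(avg_interactions, avg_unique)  # -> 2.0 1.5
```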
## Citation
If you use this dataset in your research, please cite the following [preprint](https://arxiv.org/abs/2308.06862):
```
@misc{loghmani2023effect,
title={Effect of Choosing Loss Function when Using T-batching for Representation Learning on Dynamic Networks},
author={Erfan Loghmani and MohammadAmin Fazli},
year={2023},
eprint={2308.06862},
archivePrefix={arXiv},
primaryClass={cs.LG}
}
```
|
batubayk/HU-News | ---
task_categories:
- summarization
- text-classification
- text-generation
- text2text-generation
language:
- hu
pretty_name: HU-News
size_categories:
- 100K<n<1M
---
# Citation
If you use the dataset, please cite the paper:
```
@article{10.1007/s10579-021-09568-y,
  year = {2022},
  title = {{Abstractive text summarization and new large-scale datasets for agglutinative languages Turkish and Hungarian}},
  author = {Baykara, Batuhan and Güngör, Tunga},
  journal = {Language Resources and Evaluation},
  issn = {1574-020X},
  doi = {10.1007/s10579-021-09568-y},
  pages = {1--35}
}
```
|
heliosprime/twitter_dataset_1712965782 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 6809
num_examples: 15
download_size: 8149
dataset_size: 6809
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1712965782"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
danielaivanova/damaged-media | ---
dataset_info:
features:
- name: id
dtype: string
- name: image
dtype: image
- name: annotation
dtype: image
- name: annotation_rgb
dtype: image
- name: material
dtype: string
- name: content
dtype: string
- name: type
dtype: string
- name: damage_description
dtype: string
- name: llava_description
dtype: string
- name: verified_description
dtype: string
splits:
- name: train
num_bytes: 13549689167.0
num_examples: 418
download_size: 4071052269
dataset_size: 13549689167.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "damage-analogue-media"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tttfff/test1 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: id
dtype: string
- name: package_name
dtype: string
- name: review
dtype: string
- name: date
dtype: string
- name: star
dtype: int64
- name: version_id
dtype: int64
splits:
- name: train
num_bytes: 1508
num_examples: 5
- name: test
num_bytes: 956
num_examples: 5
download_size: 9451
dataset_size: 2464
---
# Dataset Card for "test1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
SeoyeonChoi/store0312 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1917
num_examples: 18
download_size: 1885
dataset_size: 1917
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ppxscal/academic_embeddings_cosimrank_bfs | ---
dataset_info:
features:
- name: Query Text
dtype: string
- name: Ranking 1
dtype: string
- name: Ranking 2
dtype: string
- name: Ranking 3
dtype: string
- name: Ranking 4
dtype: string
- name: Ranking 5
dtype: string
- name: Ranking 6
dtype: string
- name: Ranking 7
dtype: string
- name: Ranking 8
dtype: string
- name: Ranking 9
dtype: string
- name: Ranking 10
dtype: string
- name: Ranking 11
dtype: string
- name: Ranking 12
dtype: string
- name: Ranking 13
dtype: string
- name: score_0
dtype: float64
- name: score_1
dtype: float64
- name: score_2
dtype: float64
- name: score_3
dtype: float64
- name: score_4
dtype: float64
- name: score_5
dtype: float64
- name: score_6
dtype: float64
- name: score_7
dtype: float64
- name: score_8
dtype: float64
- name: score_9
dtype: float64
- name: score_10
dtype: float64
- name: score_11
dtype: float64
- name: score_12
dtype: float64
- name: score_13
dtype: float64
splits:
- name: train
num_bytes: 2459889430
num_examples: 168966
download_size: 808003087
dataset_size: 2459889430
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
dipteshkanojia/t5-qe-2023-engu-da-test | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
- name: task
dtype: string
splits:
- name: train
num_bytes: 821362
num_examples: 1075
download_size: 269524
dataset_size: 821362
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
language:
- en
- gu
---
# Dataset Card for "t5-qe-2023-engu-da-test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jq/sunbird_speech | ---
dataset_info:
- config_name: ach
features:
- name: ids
dtype: string
- name: texts
dtype: string
- name: audios
sequence: float32
- name: audio_languages
dtype: string
- name: are_studio
dtype: bool
- name: speaker_ids
dtype: string
- name: sample_rates
dtype: int64
splits:
- name: train
num_bytes: 3820274
num_examples: 10
- name: dev
num_bytes: 3580883
num_examples: 10
- name: test
num_bytes: 3742241
num_examples: 10
download_size: 5375242
dataset_size: 11143398
- config_name: lug
features:
- name: ids
dtype: string
- name: texts
dtype: string
- name: audios
sequence: float32
- name: audio_languages
dtype: string
- name: are_studio
dtype: bool
- name: speaker_ids
dtype: string
- name: sample_rates
dtype: int64
splits:
- name: train
num_bytes: 5196706
num_examples: 10
- name: dev
num_bytes: 3370989
num_examples: 10
- name: test
num_bytes: 2899936
num_examples: 10
download_size: 5732034
dataset_size: 11467631
configs:
- config_name: ach
data_files:
- split: train
path: ach/train-*
- split: dev
path: ach/dev-*
- split: test
path: ach/test-*
- config_name: lug
data_files:
- split: train
path: lug/train-*
- split: dev
path: lug/dev-*
- split: test
path: lug/test-*
---
|
chagasclone/agrecivo | ---
license: openrail
---
|
kartashoffv/vedomosti_articles | ---
license: mit
---
|
CyberHarem/kuroshio_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of kuroshio/黒潮/黑潮 (Azur Lane)
This is the dataset of kuroshio/黒潮/黑潮 (Azur Lane), containing 10 images and their tags.
The core tags of this character are `braid, horns, red_eyes, hair_flower, hair_ornament, long_hair, twin_braids, bangs, pointy_ears, black_hair, bow, red_bow, hair_bow, sidelocks, red_ribbon`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 10 | 10.75 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kuroshio_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 10 | 6.29 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kuroshio_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 21 | 11.74 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kuroshio_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 10 | 9.34 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kuroshio_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 21 | 16.61 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kuroshio_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/kuroshio_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 10 |  |  |  |  |  | 1girl, solo, looking_at_viewer, black_scarf, pleated_skirt, red_thighhighs, bare_shoulders, black_skirt, obi, white_background, bridal_gauntlets, elbow_gloves, panties, simple_background, garter_straps, weapon, blush, closed_mouth, floral_print, full_body, kimono, pink_flower, shoes, white_footwear |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | black_scarf | pleated_skirt | red_thighhighs | bare_shoulders | black_skirt | obi | white_background | bridal_gauntlets | elbow_gloves | panties | simple_background | garter_straps | weapon | blush | closed_mouth | floral_print | full_body | kimono | pink_flower | shoes | white_footwear |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:--------------|:----------------|:-----------------|:-----------------|:--------------|:------|:-------------------|:-------------------|:---------------|:----------|:--------------------|:----------------|:---------|:--------|:---------------|:---------------|:------------|:---------|:--------------|:--------|:-----------------|
| 0 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
TheusTW/isa | ---
license: openrail
---
|
hongerzh/my-nft-prompt-and-sale-label | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
- name: label
dtype: float64
splits:
- name: train
num_bytes: 5747469808.67
num_examples: 29339
- name: validation
num_bytes: 1910439936.185
num_examples: 9777
- name: test
num_bytes: 2129410854.38
num_examples: 9780
download_size: 9022431797
dataset_size: 9787320599.235
---
# Dataset Card for "my-nft-prompt-and-sale-label"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_jondurbin__airoboros-65b-gpt4-1.3 | ---
pretty_name: Evaluation run of jondurbin/airoboros-65b-gpt4-1.3
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [jondurbin/airoboros-65b-gpt4-1.3](https://huggingface.co/jondurbin/airoboros-65b-gpt4-1.3)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jondurbin__airoboros-65b-gpt4-1.3\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-19T00:22:24.283273](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-65b-gpt4-1.3/blob/main/results_2023-10-19T00-22-24.283273.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.40635486577181207,\n\
\ \"em_stderr\": 0.00502985933530148,\n \"f1\": 0.49071728187919794,\n\
\ \"f1_stderr\": 0.0047528105237378505,\n \"acc\": 0.4679967304402357,\n\
\ \"acc_stderr\": 0.010353850140010314\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.40635486577181207,\n \"em_stderr\": 0.00502985933530148,\n\
\ \"f1\": 0.49071728187919794,\n \"f1_stderr\": 0.0047528105237378505\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.13646702047005307,\n \
\ \"acc_stderr\": 0.00945574199881554\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7995264404104183,\n \"acc_stderr\": 0.011251958281205085\n\
\ }\n}\n```"
repo_url: https://huggingface.co/jondurbin/airoboros-65b-gpt4-1.3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_09T14_21_18.857678
path:
- '**/details_harness|arc:challenge|25_2023-08-09T14:21:18.857678.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-09T14:21:18.857678.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_19T00_22_24.283273
path:
- '**/details_harness|drop|3_2023-10-19T00-22-24.283273.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-19T00-22-24.283273.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_19T00_22_24.283273
path:
- '**/details_harness|gsm8k|5_2023-10-19T00-22-24.283273.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-19T00-22-24.283273.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_09T14_21_18.857678
path:
- '**/details_harness|hellaswag|10_2023-08-09T14:21:18.857678.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-09T14:21:18.857678.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_09T14_21_18.857678
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T14:21:18.857678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T14:21:18.857678.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T14:21:18.857678.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_09T14_21_18.857678
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T14:21:18.857678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T14:21:18.857678.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_09T14_21_18.857678
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T14:21:18.857678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T14:21:18.857678.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_09T14_21_18.857678
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T14:21:18.857678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T14:21:18.857678.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_09T14_21_18.857678
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T14:21:18.857678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T14:21:18.857678.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_09T14_21_18.857678
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T14:21:18.857678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T14:21:18.857678.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_09T14_21_18.857678
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T14:21:18.857678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T14:21:18.857678.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_09T14_21_18.857678
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T14:21:18.857678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T14:21:18.857678.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_09T14_21_18.857678
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T14:21:18.857678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T14:21:18.857678.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_09T14_21_18.857678
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T14:21:18.857678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T14:21:18.857678.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_09T14_21_18.857678
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T14:21:18.857678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T14:21:18.857678.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_09T14_21_18.857678
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T14:21:18.857678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T14:21:18.857678.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_09T14_21_18.857678
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T14:21:18.857678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T14:21:18.857678.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_09T14_21_18.857678
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T14:21:18.857678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T14:21:18.857678.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_09T14_21_18.857678
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T14:21:18.857678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T14:21:18.857678.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_09T14_21_18.857678
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T14:21:18.857678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T14:21:18.857678.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_09T14_21_18.857678
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T14:21:18.857678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T14:21:18.857678.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_09T14_21_18.857678
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T14:21:18.857678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T14:21:18.857678.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_09T14_21_18.857678
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T14:21:18.857678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T14:21:18.857678.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_09T14_21_18.857678
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T14:21:18.857678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T14:21:18.857678.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_09T14_21_18.857678
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T14:21:18.857678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T14:21:18.857678.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_09T14_21_18.857678
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T14:21:18.857678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T14:21:18.857678.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_09T14_21_18.857678
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T14:21:18.857678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T14:21:18.857678.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_09T14_21_18.857678
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T14:21:18.857678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T14:21:18.857678.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_09T14_21_18.857678
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T14:21:18.857678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T14:21:18.857678.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_09T14_21_18.857678
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T14:21:18.857678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T14:21:18.857678.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_09T14_21_18.857678
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T14:21:18.857678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T14:21:18.857678.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_09T14_21_18.857678
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T14:21:18.857678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T14:21:18.857678.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_09T14_21_18.857678
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T14:21:18.857678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T14:21:18.857678.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_09T14_21_18.857678
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T14:21:18.857678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T14:21:18.857678.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_09T14_21_18.857678
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T14:21:18.857678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T14:21:18.857678.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_09T14_21_18.857678
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T14:21:18.857678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T14:21:18.857678.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_09T14_21_18.857678
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T14:21:18.857678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T14:21:18.857678.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_09T14_21_18.857678
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T14:21:18.857678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T14:21:18.857678.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_09T14_21_18.857678
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T14:21:18.857678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T14:21:18.857678.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_09T14_21_18.857678
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T14:21:18.857678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T14:21:18.857678.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_09T14_21_18.857678
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T14:21:18.857678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T14:21:18.857678.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_09T14_21_18.857678
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T14:21:18.857678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T14:21:18.857678.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_09T14_21_18.857678
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T14:21:18.857678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T14:21:18.857678.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_09T14_21_18.857678
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T14:21:18.857678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T14:21:18.857678.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_09T14_21_18.857678
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T14:21:18.857678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T14:21:18.857678.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_09T14_21_18.857678
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T14:21:18.857678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T14:21:18.857678.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_09T14_21_18.857678
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T14:21:18.857678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T14:21:18.857678.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_09T14_21_18.857678
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T14:21:18.857678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T14:21:18.857678.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_09T14_21_18.857678
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T14:21:18.857678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T14:21:18.857678.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_09T14_21_18.857678
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T14:21:18.857678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T14:21:18.857678.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_09T14_21_18.857678
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T14:21:18.857678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T14:21:18.857678.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_09T14_21_18.857678
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T14:21:18.857678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T14:21:18.857678.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_09T14_21_18.857678
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T14:21:18.857678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T14:21:18.857678.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_09T14_21_18.857678
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T14:21:18.857678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T14:21:18.857678.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_09T14_21_18.857678
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T14:21:18.857678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T14:21:18.857678.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_09T14_21_18.857678
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T14:21:18.857678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T14:21:18.857678.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_09T14_21_18.857678
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T14:21:18.857678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T14:21:18.857678.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_09T14_21_18.857678
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T14:21:18.857678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T14:21:18.857678.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_09T14_21_18.857678
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T14:21:18.857678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T14:21:18.857678.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_09T14_21_18.857678
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T14:21:18.857678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T14:21:18.857678.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_09T14_21_18.857678
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T14:21:18.857678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T14:21:18.857678.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_09T14_21_18.857678
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T14:21:18.857678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T14:21:18.857678.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_09T14_21_18.857678
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T14:21:18.857678.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T14:21:18.857678.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_19T00_22_24.283273
path:
- '**/details_harness|winogrande|5_2023-10-19T00-22-24.283273.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-19T00-22-24.283273.parquet'
- config_name: results
data_files:
- split: 2023_08_09T14_21_18.857678
path:
- results_2023-08-09T14:21:18.857678.parquet
- split: 2023_10_19T00_22_24.283273
path:
- results_2023-10-19T00-22-24.283273.parquet
- split: latest
path:
- results_2023-10-19T00-22-24.283273.parquet
---
# Dataset Card for Evaluation run of jondurbin/airoboros-65b-gpt4-1.3
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/jondurbin/airoboros-65b-gpt4-1.3
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [jondurbin/airoboros-65b-gpt4-1.3](https://huggingface.co/jondurbin/airoboros-65b-gpt4-1.3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
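The mapping from a run timestamp to its split name appears to follow a simple rule, inferred from the split names in this card's config (not documented anywhere official): the `-` and `:` characters of the ISO timestamp are replaced with `_`, while the `.` before the microseconds is kept.

```python
def timestamp_to_split(ts: str) -> str:
    """Derive a split name from a run timestamp (rule inferred from this card)."""
    return ts.replace("-", "_").replace(":", "_")

# The run recorded at 2023-10-19T00:22:24.283273 shows up as this split:
print(timestamp_to_split("2023-10-19T00:22:24.283273"))
# 2023_10_19T00_22_24.283273
```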
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-65b-gpt4-1.3",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-19T00:22:24.283273](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-65b-gpt4-1.3/blob/main/results_2023-10-19T00-22-24.283273.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.40635486577181207,
"em_stderr": 0.00502985933530148,
"f1": 0.49071728187919794,
"f1_stderr": 0.0047528105237378505,
"acc": 0.4679967304402357,
"acc_stderr": 0.010353850140010314
},
"harness|drop|3": {
"em": 0.40635486577181207,
"em_stderr": 0.00502985933530148,
"f1": 0.49071728187919794,
"f1_stderr": 0.0047528105237378505
},
"harness|gsm8k|5": {
"acc": 0.13646702047005307,
"acc_stderr": 0.00945574199881554
},
"harness|winogrande|5": {
"acc": 0.7995264404104183,
"acc_stderr": 0.011251958281205085
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Ujjwal123/Crossword-Generator | ---
license: mit
---
|
VatsaDev/robofunctions | ---
license: mit
---
Part of a dataset for converting text to functions for a robot.
These are just raw possible instructions; the text versions have not been made yet. There are about 2 million raw functions, with 10K converted to words. |
tyzhu/find_last_sent_train_50_eval_10_hint5 | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: title
dtype: string
- name: context
dtype: string
splits:
- name: train
num_bytes: 135976
num_examples: 110
- name: validation
num_bytes: 9357
num_examples: 10
download_size: 82351
dataset_size: 145333
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
# Dataset Card for "find_last_sent_train_50_eval_10_hint5"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
MohammedNasri/Denoised_data_jason2 | ---
dataset_info:
features:
- name: data
struct:
- name: audio
struct:
- name: array
sequence: float64
- name: path
dtype: 'null'
- name: sampling_rate
dtype: int64
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 1127259888
num_examples: 2000
download_size: 278526142
dataset_size: 1127259888
---
# Dataset Card for "Denoised_data_jason2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Sao10K__BrainDerp3 | ---
pretty_name: Evaluation run of Sao10K/BrainDerp3
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Sao10K/BrainDerp3](https://huggingface.co/Sao10K/BrainDerp3) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Sao10K__BrainDerp3\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-28T22:57:20.816050](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__BrainDerp3/blob/main/results_2023-10-28T22-57-20.816050.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0269505033557047,\n\
\ \"em_stderr\": 0.0016584048452624436,\n \"f1\": 0.1492428691275163,\n\
\ \"f1_stderr\": 0.002525870073512654,\n \"acc\": 0.4182403617100085,\n\
\ \"acc_stderr\": 0.009778590926073638\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0269505033557047,\n \"em_stderr\": 0.0016584048452624436,\n\
\ \"f1\": 0.1492428691275163,\n \"f1_stderr\": 0.002525870073512654\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0803639120545868,\n \
\ \"acc_stderr\": 0.007488258573239077\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7561168113654302,\n \"acc_stderr\": 0.012068923278908199\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Sao10K/BrainDerp3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_04T07_48_05.088946
path:
- '**/details_harness|arc:challenge|25_2023-10-04T07-48-05.088946.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-04T07-48-05.088946.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_28T22_57_20.816050
path:
- '**/details_harness|drop|3_2023-10-28T22-57-20.816050.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-28T22-57-20.816050.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_28T22_57_20.816050
path:
- '**/details_harness|gsm8k|5_2023-10-28T22-57-20.816050.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-28T22-57-20.816050.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_04T07_48_05.088946
path:
- '**/details_harness|hellaswag|10_2023-10-04T07-48-05.088946.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-04T07-48-05.088946.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_04T07_48_05.088946
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T07-48-05.088946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T07-48-05.088946.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T07-48-05.088946.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_04T07_48_05.088946
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T07-48-05.088946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T07-48-05.088946.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_04T07_48_05.088946
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T07-48-05.088946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T07-48-05.088946.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_04T07_48_05.088946
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T07-48-05.088946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T07-48-05.088946.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_04T07_48_05.088946
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T07-48-05.088946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T07-48-05.088946.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_04T07_48_05.088946
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T07-48-05.088946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T07-48-05.088946.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_04T07_48_05.088946
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T07-48-05.088946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T07-48-05.088946.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_04T07_48_05.088946
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T07-48-05.088946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T07-48-05.088946.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_04T07_48_05.088946
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T07-48-05.088946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T07-48-05.088946.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_04T07_48_05.088946
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T07-48-05.088946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T07-48-05.088946.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_04T07_48_05.088946
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T07-48-05.088946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T07-48-05.088946.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_04T07_48_05.088946
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T07-48-05.088946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T07-48-05.088946.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_04T07_48_05.088946
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T07-48-05.088946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T07-48-05.088946.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_04T07_48_05.088946
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T07-48-05.088946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T07-48-05.088946.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_04T07_48_05.088946
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T07-48-05.088946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T07-48-05.088946.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_04T07_48_05.088946
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T07-48-05.088946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T07-48-05.088946.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_04T07_48_05.088946
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T07-48-05.088946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T07-48-05.088946.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_04T07_48_05.088946
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T07-48-05.088946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T07-48-05.088946.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_04T07_48_05.088946
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T07-48-05.088946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T07-48-05.088946.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_04T07_48_05.088946
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T07-48-05.088946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T07-48-05.088946.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_04T07_48_05.088946
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T07-48-05.088946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T07-48-05.088946.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_04T07_48_05.088946
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T07-48-05.088946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T07-48-05.088946.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_04T07_48_05.088946
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T07-48-05.088946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T07-48-05.088946.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_04T07_48_05.088946
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T07-48-05.088946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T07-48-05.088946.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_04T07_48_05.088946
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T07-48-05.088946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T07-48-05.088946.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_04T07_48_05.088946
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T07-48-05.088946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T07-48-05.088946.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_04T07_48_05.088946
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T07-48-05.088946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T07-48-05.088946.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_04T07_48_05.088946
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T07-48-05.088946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T07-48-05.088946.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_04T07_48_05.088946
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T07-48-05.088946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T07-48-05.088946.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_04T07_48_05.088946
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T07-48-05.088946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T07-48-05.088946.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_04T07_48_05.088946
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T07-48-05.088946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T07-48-05.088946.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_04T07_48_05.088946
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T07-48-05.088946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T07-48-05.088946.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_04T07_48_05.088946
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T07-48-05.088946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T07-48-05.088946.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_04T07_48_05.088946
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T07-48-05.088946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T07-48-05.088946.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_04T07_48_05.088946
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T07-48-05.088946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T07-48-05.088946.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_04T07_48_05.088946
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T07-48-05.088946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T07-48-05.088946.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_04T07_48_05.088946
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T07-48-05.088946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T07-48-05.088946.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_04T07_48_05.088946
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T07-48-05.088946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T07-48-05.088946.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_04T07_48_05.088946
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T07-48-05.088946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T07-48-05.088946.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_04T07_48_05.088946
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-04T07-48-05.088946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-04T07-48-05.088946.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_04T07_48_05.088946
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T07-48-05.088946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T07-48-05.088946.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_04T07_48_05.088946
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T07-48-05.088946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T07-48-05.088946.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_04T07_48_05.088946
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T07-48-05.088946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T07-48-05.088946.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_04T07_48_05.088946
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T07-48-05.088946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T07-48-05.088946.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_04T07_48_05.088946
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T07-48-05.088946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T07-48-05.088946.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_04T07_48_05.088946
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T07-48-05.088946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T07-48-05.088946.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_04T07_48_05.088946
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T07-48-05.088946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T07-48-05.088946.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_04T07_48_05.088946
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T07-48-05.088946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T07-48-05.088946.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_04T07_48_05.088946
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T07-48-05.088946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T07-48-05.088946.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_04T07_48_05.088946
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T07-48-05.088946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T07-48-05.088946.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_04T07_48_05.088946
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T07-48-05.088946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T07-48-05.088946.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_04T07_48_05.088946
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T07-48-05.088946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T07-48-05.088946.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_04T07_48_05.088946
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T07-48-05.088946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T07-48-05.088946.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_04T07_48_05.088946
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T07-48-05.088946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T07-48-05.088946.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_04T07_48_05.088946
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T07-48-05.088946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T07-48-05.088946.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_04T07_48_05.088946
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T07-48-05.088946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T07-48-05.088946.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_04T07_48_05.088946
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T07-48-05.088946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T07-48-05.088946.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_04T07_48_05.088946
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T07-48-05.088946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T07-48-05.088946.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_04T07_48_05.088946
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-04T07-48-05.088946.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-04T07-48-05.088946.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_28T22_57_20.816050
path:
- '**/details_harness|winogrande|5_2023-10-28T22-57-20.816050.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-28T22-57-20.816050.parquet'
- config_name: results
data_files:
- split: 2023_10_04T07_48_05.088946
path:
- results_2023-10-04T07-48-05.088946.parquet
- split: 2023_10_28T22_57_20.816050
path:
- results_2023-10-28T22-57-20.816050.parquet
- split: latest
path:
- results_2023-10-28T22-57-20.816050.parquet
---
# Dataset Card for Evaluation run of Sao10K/BrainDerp3
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Sao10K/BrainDerp3
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Sao10K/BrainDerp3](https://huggingface.co/Sao10K/BrainDerp3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Sao10K__BrainDerp3",
"harness_winogrande_5",
split="train")
```
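As an aside, each per-run split name is simply the run timestamp with the characters that are invalid in split names (`-` and `:`) replaced by underscores (compare the split `2023_10_28T22_57_20.816050` in the configurations below with the run timestamp `2023-10-28T22:57:20.816050`). A small illustrative helper (not part of the `datasets` API) sketches that mapping:

```python
def timestamp_to_split(timestamp: str) -> str:
    """Map a run timestamp (e.g. "2023-10-28T22:57:20.816050")
    to the split name used in this dataset's configurations."""
    # Split names keep the "." before the microseconds but replace
    # "-" and ":" with "_".
    return timestamp.replace("-", "_").replace(":", "_")

print(timestamp_to_split("2023-10-28T22:57:20.816050"))
# -> 2023_10_28T22_57_20.816050
```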
## Latest results
These are the [latest results from run 2023-10-28T22:57:20.816050](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__BrainDerp3/blob/main/results_2023-10-28T22-57-20.816050.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0269505033557047,
"em_stderr": 0.0016584048452624436,
"f1": 0.1492428691275163,
"f1_stderr": 0.002525870073512654,
"acc": 0.4182403617100085,
"acc_stderr": 0.009778590926073638
},
"harness|drop|3": {
"em": 0.0269505033557047,
"em_stderr": 0.0016584048452624436,
"f1": 0.1492428691275163,
"f1_stderr": 0.002525870073512654
},
"harness|gsm8k|5": {
"acc": 0.0803639120545868,
"acc_stderr": 0.007488258573239077
},
"harness|winogrande|5": {
"acc": 0.7561168113654302,
"acc_stderr": 0.012068923278908199
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_rizla__raccoon-small | ---
pretty_name: Evaluation run of rizla/raccoon-small
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [rizla/raccoon-small](https://huggingface.co/rizla/raccoon-small) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_rizla__raccoon-small\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-02T10:01:35.686366](https://huggingface.co/datasets/open-llm-leaderboard/details_rizla__raccoon-small/blob/main/results_2024-02-02T10-01-35.686366.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; each task's results appear under its own configuration, with a \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.650595848646737,\n\
\ \"acc_stderr\": 0.03218536411440889,\n \"acc_norm\": 0.651267003548253,\n\
\ \"acc_norm_stderr\": 0.032861158588995604,\n \"mc1\": 0.6193390452876377,\n\
\ \"mc1_stderr\": 0.016997627871907915,\n \"mc2\": 0.7673830386789108,\n\
\ \"mc2_stderr\": 0.013988013317866293\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7320819112627986,\n \"acc_stderr\": 0.01294203019513643,\n\
\ \"acc_norm\": 0.7440273037542662,\n \"acc_norm_stderr\": 0.01275301324124452\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7184823740290779,\n\
\ \"acc_stderr\": 0.00448820175664258,\n \"acc_norm\": 0.8872734515036845,\n\
\ \"acc_norm_stderr\": 0.003156118964752944\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n\
\ \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443866,\n\
\ \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443866\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n\
\ \"acc_stderr\": 0.0358687928008034,\n \"acc_norm\": 0.7569444444444444,\n\
\ \"acc_norm_stderr\": 0.0358687928008034\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6358381502890174,\n\
\ \"acc_stderr\": 0.03669072477416907,\n \"acc_norm\": 0.6358381502890174,\n\
\ \"acc_norm_stderr\": 0.03669072477416907\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.72,\n\
\ \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n\
\ \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482758,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482758\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42857142857142855,\n \"acc_stderr\": 0.025487187147859375,\n \"\
acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.025487187147859375\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n\
\ \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n\
\ \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7935483870967742,\n\
\ \"acc_stderr\": 0.023025899617188723,\n \"acc_norm\": 0.7935483870967742,\n\
\ \"acc_norm_stderr\": 0.023025899617188723\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\"\
: 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.806060606060606,\n \"acc_stderr\": 0.03087414513656209,\n\
\ \"acc_norm\": 0.806060606060606,\n \"acc_norm_stderr\": 0.03087414513656209\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7828282828282829,\n \"acc_stderr\": 0.029376616484945633,\n \"\
acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.029376616484945633\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.023381935348121427,\n\
\ \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.023381935348121427\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6717948717948717,\n \"acc_stderr\": 0.023807633198657266,\n\
\ \"acc_norm\": 0.6717948717948717,\n \"acc_norm_stderr\": 0.023807633198657266\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34444444444444444,\n \"acc_stderr\": 0.028972648884844267,\n \
\ \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.028972648884844267\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.029597329730978093,\n\
\ \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.029597329730978093\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255169,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255169\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8513761467889909,\n \"acc_stderr\": 0.015251253773660836,\n \"\
acc_norm\": 0.8513761467889909,\n \"acc_norm_stderr\": 0.015251253773660836\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"\
acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8382352941176471,\n \"acc_stderr\": 0.025845017986926917,\n \"\
acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.025845017986926917\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8016877637130801,\n \"acc_stderr\": 0.02595502084162112,\n \
\ \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.02595502084162112\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6636771300448431,\n\
\ \"acc_stderr\": 0.031708824268455,\n \"acc_norm\": 0.6636771300448431,\n\
\ \"acc_norm_stderr\": 0.031708824268455\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159465,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159465\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.0335195387952127,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.0335195387952127\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406974,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406974\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8212005108556832,\n\
\ \"acc_stderr\": 0.013702643715368983,\n \"acc_norm\": 0.8212005108556832,\n\
\ \"acc_norm_stderr\": 0.013702643715368983\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7225433526011561,\n \"acc_stderr\": 0.024105712607754307,\n\
\ \"acc_norm\": 0.7225433526011561,\n \"acc_norm_stderr\": 0.024105712607754307\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4301675977653631,\n\
\ \"acc_stderr\": 0.01655860163604104,\n \"acc_norm\": 0.4301675977653631,\n\
\ \"acc_norm_stderr\": 0.01655860163604104\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7320261437908496,\n \"acc_stderr\": 0.025360603796242557,\n\
\ \"acc_norm\": 0.7320261437908496,\n \"acc_norm_stderr\": 0.025360603796242557\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n\
\ \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n\
\ \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7345679012345679,\n \"acc_stderr\": 0.024569223600460845,\n\
\ \"acc_norm\": 0.7345679012345679,\n \"acc_norm_stderr\": 0.024569223600460845\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5070921985815603,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.5070921985815603,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46153846153846156,\n\
\ \"acc_stderr\": 0.01273239828619044,\n \"acc_norm\": 0.46153846153846156,\n\
\ \"acc_norm_stderr\": 0.01273239828619044\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.02824568739146292,\n\
\ \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.02824568739146292\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6683006535947712,\n \"acc_stderr\": 0.019047485239360378,\n \
\ \"acc_norm\": 0.6683006535947712,\n \"acc_norm_stderr\": 0.019047485239360378\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142783,\n\
\ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142783\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n\
\ \"acc_stderr\": 0.02650859065623327,\n \"acc_norm\": 0.8308457711442786,\n\
\ \"acc_norm_stderr\": 0.02650859065623327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n\
\ \"acc_stderr\": 0.03858158940685515,\n \"acc_norm\": 0.5662650602409639,\n\
\ \"acc_norm_stderr\": 0.03858158940685515\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.029913127232368043,\n\
\ \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.029913127232368043\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6193390452876377,\n\
\ \"mc1_stderr\": 0.016997627871907915,\n \"mc2\": 0.7673830386789108,\n\
\ \"mc2_stderr\": 0.013988013317866293\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8737174427782163,\n \"acc_stderr\": 0.009335559129908475\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5686125852918877,\n \
\ \"acc_stderr\": 0.013642195352511568\n }\n}\n```"
repo_url: https://huggingface.co/rizla/raccoon-small
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_02T09_54_15.565869
path:
- '**/details_harness|arc:challenge|25_2024-02-02T09-54-15.565869.parquet'
- split: 2024_02_02T10_01_35.686366
path:
- '**/details_harness|arc:challenge|25_2024-02-02T10-01-35.686366.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-02T10-01-35.686366.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_02T09_54_15.565869
path:
- '**/details_harness|gsm8k|5_2024-02-02T09-54-15.565869.parquet'
- split: 2024_02_02T10_01_35.686366
path:
- '**/details_harness|gsm8k|5_2024-02-02T10-01-35.686366.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-02T10-01-35.686366.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_02T09_54_15.565869
path:
- '**/details_harness|hellaswag|10_2024-02-02T09-54-15.565869.parquet'
- split: 2024_02_02T10_01_35.686366
path:
- '**/details_harness|hellaswag|10_2024-02-02T10-01-35.686366.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-02T10-01-35.686366.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_02T09_54_15.565869
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T09-54-15.565869.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T09-54-15.565869.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T09-54-15.565869.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T09-54-15.565869.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T09-54-15.565869.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T09-54-15.565869.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T09-54-15.565869.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T09-54-15.565869.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T09-54-15.565869.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T09-54-15.565869.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T09-54-15.565869.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T09-54-15.565869.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T09-54-15.565869.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T09-54-15.565869.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T09-54-15.565869.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T09-54-15.565869.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T09-54-15.565869.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T09-54-15.565869.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T09-54-15.565869.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T09-54-15.565869.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T09-54-15.565869.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T09-54-15.565869.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T09-54-15.565869.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T09-54-15.565869.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T09-54-15.565869.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T09-54-15.565869.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T09-54-15.565869.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T09-54-15.565869.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T09-54-15.565869.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T09-54-15.565869.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T09-54-15.565869.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T09-54-15.565869.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T09-54-15.565869.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T09-54-15.565869.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T09-54-15.565869.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T09-54-15.565869.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T09-54-15.565869.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T09-54-15.565869.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-02T09-54-15.565869.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T09-54-15.565869.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T09-54-15.565869.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T09-54-15.565869.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T09-54-15.565869.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T09-54-15.565869.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T09-54-15.565869.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T09-54-15.565869.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T09-54-15.565869.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T09-54-15.565869.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T09-54-15.565869.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T09-54-15.565869.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T09-54-15.565869.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T09-54-15.565869.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T09-54-15.565869.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T09-54-15.565869.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T09-54-15.565869.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T09-54-15.565869.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T09-54-15.565869.parquet'
- split: 2024_02_02T10_01_35.686366
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T10-01-35.686366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T10-01-35.686366.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T10-01-35.686366.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_02T09_54_15.565869
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T09-54-15.565869.parquet'
- split: 2024_02_02T10_01_35.686366
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T10-01-35.686366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T10-01-35.686366.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_02T09_54_15.565869
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T09-54-15.565869.parquet'
- split: 2024_02_02T10_01_35.686366
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T10-01-35.686366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T10-01-35.686366.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_02T09_54_15.565869
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T09-54-15.565869.parquet'
- split: 2024_02_02T10_01_35.686366
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T10-01-35.686366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T10-01-35.686366.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_02T09_54_15.565869
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T09-54-15.565869.parquet'
- split: 2024_02_02T10_01_35.686366
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T10-01-35.686366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T10-01-35.686366.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_02T09_54_15.565869
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T09-54-15.565869.parquet'
- split: 2024_02_02T10_01_35.686366
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T10-01-35.686366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T10-01-35.686366.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_02T09_54_15.565869
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T09-54-15.565869.parquet'
- split: 2024_02_02T10_01_35.686366
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T10-01-35.686366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T10-01-35.686366.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_02T09_54_15.565869
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T09-54-15.565869.parquet'
- split: 2024_02_02T10_01_35.686366
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T10-01-35.686366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T10-01-35.686366.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_02T09_54_15.565869
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T09-54-15.565869.parquet'
- split: 2024_02_02T10_01_35.686366
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T10-01-35.686366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T10-01-35.686366.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_02T09_54_15.565869
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T09-54-15.565869.parquet'
- split: 2024_02_02T10_01_35.686366
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T10-01-35.686366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T10-01-35.686366.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_02T09_54_15.565869
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T09-54-15.565869.parquet'
- split: 2024_02_02T10_01_35.686366
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T10-01-35.686366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T10-01-35.686366.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_02T09_54_15.565869
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T09-54-15.565869.parquet'
- split: 2024_02_02T10_01_35.686366
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T10-01-35.686366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T10-01-35.686366.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_02T09_54_15.565869
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T09-54-15.565869.parquet'
- split: 2024_02_02T10_01_35.686366
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T10-01-35.686366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T10-01-35.686366.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_02T09_54_15.565869
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T09-54-15.565869.parquet'
- split: 2024_02_02T10_01_35.686366
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T10-01-35.686366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T10-01-35.686366.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_02T09_54_15.565869
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T09-54-15.565869.parquet'
- split: 2024_02_02T10_01_35.686366
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T10-01-35.686366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T10-01-35.686366.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_02T09_54_15.565869
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T09-54-15.565869.parquet'
- split: 2024_02_02T10_01_35.686366
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T10-01-35.686366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T10-01-35.686366.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_02T09_54_15.565869
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T09-54-15.565869.parquet'
- split: 2024_02_02T10_01_35.686366
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T10-01-35.686366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T10-01-35.686366.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_02T09_54_15.565869
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T09-54-15.565869.parquet'
- split: 2024_02_02T10_01_35.686366
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T10-01-35.686366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T10-01-35.686366.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_02T09_54_15.565869
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T09-54-15.565869.parquet'
- split: 2024_02_02T10_01_35.686366
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T10-01-35.686366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T10-01-35.686366.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_02T09_54_15.565869
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T09-54-15.565869.parquet'
- split: 2024_02_02T10_01_35.686366
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T10-01-35.686366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T10-01-35.686366.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_02T09_54_15.565869
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T09-54-15.565869.parquet'
- split: 2024_02_02T10_01_35.686366
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T10-01-35.686366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T10-01-35.686366.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_02T09_54_15.565869
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T09-54-15.565869.parquet'
- split: 2024_02_02T10_01_35.686366
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T10-01-35.686366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T10-01-35.686366.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_02T09_54_15.565869
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T09-54-15.565869.parquet'
- split: 2024_02_02T10_01_35.686366
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T10-01-35.686366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T10-01-35.686366.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_02T09_54_15.565869
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T09-54-15.565869.parquet'
- split: 2024_02_02T10_01_35.686366
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T10-01-35.686366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T10-01-35.686366.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_02T09_54_15.565869
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T09-54-15.565869.parquet'
- split: 2024_02_02T10_01_35.686366
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T10-01-35.686366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T10-01-35.686366.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_02T09_54_15.565869
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T09-54-15.565869.parquet'
- split: 2024_02_02T10_01_35.686366
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T10-01-35.686366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T10-01-35.686366.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_02T09_54_15.565869
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T09-54-15.565869.parquet'
- split: 2024_02_02T10_01_35.686366
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T10-01-35.686366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T10-01-35.686366.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_02T09_54_15.565869
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T09-54-15.565869.parquet'
- split: 2024_02_02T10_01_35.686366
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T10-01-35.686366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T10-01-35.686366.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_02T09_54_15.565869
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T09-54-15.565869.parquet'
- split: 2024_02_02T10_01_35.686366
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T10-01-35.686366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T10-01-35.686366.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_02T09_54_15.565869
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T09-54-15.565869.parquet'
- split: 2024_02_02T10_01_35.686366
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T10-01-35.686366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T10-01-35.686366.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_02T09_54_15.565869
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T09-54-15.565869.parquet'
- split: 2024_02_02T10_01_35.686366
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T10-01-35.686366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T10-01-35.686366.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_02T09_54_15.565869
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T09-54-15.565869.parquet'
- split: 2024_02_02T10_01_35.686366
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T10-01-35.686366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T10-01-35.686366.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_02T09_54_15.565869
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T09-54-15.565869.parquet'
- split: 2024_02_02T10_01_35.686366
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T10-01-35.686366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T10-01-35.686366.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_02T09_54_15.565869
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T09-54-15.565869.parquet'
- split: 2024_02_02T10_01_35.686366
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T10-01-35.686366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T10-01-35.686366.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_02T09_54_15.565869
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T09-54-15.565869.parquet'
- split: 2024_02_02T10_01_35.686366
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T10-01-35.686366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T10-01-35.686366.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_02T09_54_15.565869
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T09-54-15.565869.parquet'
- split: 2024_02_02T10_01_35.686366
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T10-01-35.686366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T10-01-35.686366.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_02T09_54_15.565869
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T09-54-15.565869.parquet'
- split: 2024_02_02T10_01_35.686366
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T10-01-35.686366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T10-01-35.686366.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_02T09_54_15.565869
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T09-54-15.565869.parquet'
- split: 2024_02_02T10_01_35.686366
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T10-01-35.686366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T10-01-35.686366.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_02T09_54_15.565869
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T09-54-15.565869.parquet'
- split: 2024_02_02T10_01_35.686366
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T10-01-35.686366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T10-01-35.686366.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_02T09_54_15.565869
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-02T09-54-15.565869.parquet'
- split: 2024_02_02T10_01_35.686366
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-02T10-01-35.686366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-02T10-01-35.686366.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_02T09_54_15.565869
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T09-54-15.565869.parquet'
- split: 2024_02_02T10_01_35.686366
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T10-01-35.686366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T10-01-35.686366.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_02T09_54_15.565869
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T09-54-15.565869.parquet'
- split: 2024_02_02T10_01_35.686366
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T10-01-35.686366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T10-01-35.686366.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_02T09_54_15.565869
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T09-54-15.565869.parquet'
- split: 2024_02_02T10_01_35.686366
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T10-01-35.686366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T10-01-35.686366.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_02T09_54_15.565869
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T09-54-15.565869.parquet'
- split: 2024_02_02T10_01_35.686366
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T10-01-35.686366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T10-01-35.686366.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_02T09_54_15.565869
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T09-54-15.565869.parquet'
- split: 2024_02_02T10_01_35.686366
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T10-01-35.686366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T10-01-35.686366.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_02T09_54_15.565869
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T09-54-15.565869.parquet'
- split: 2024_02_02T10_01_35.686366
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T10-01-35.686366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T10-01-35.686366.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_02T09_54_15.565869
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T09-54-15.565869.parquet'
- split: 2024_02_02T10_01_35.686366
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T10-01-35.686366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T10-01-35.686366.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_02T09_54_15.565869
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T09-54-15.565869.parquet'
- split: 2024_02_02T10_01_35.686366
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T10-01-35.686366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T10-01-35.686366.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_02T09_54_15.565869
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T09-54-15.565869.parquet'
- split: 2024_02_02T10_01_35.686366
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T10-01-35.686366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T10-01-35.686366.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_02T09_54_15.565869
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T09-54-15.565869.parquet'
- split: 2024_02_02T10_01_35.686366
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T10-01-35.686366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T10-01-35.686366.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_02T09_54_15.565869
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T09-54-15.565869.parquet'
- split: 2024_02_02T10_01_35.686366
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T10-01-35.686366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T10-01-35.686366.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_02T09_54_15.565869
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T09-54-15.565869.parquet'
- split: 2024_02_02T10_01_35.686366
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T10-01-35.686366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T10-01-35.686366.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_02T09_54_15.565869
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T09-54-15.565869.parquet'
- split: 2024_02_02T10_01_35.686366
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T10-01-35.686366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T10-01-35.686366.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_02T09_54_15.565869
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T09-54-15.565869.parquet'
- split: 2024_02_02T10_01_35.686366
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T10-01-35.686366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T10-01-35.686366.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_02T09_54_15.565869
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T09-54-15.565869.parquet'
- split: 2024_02_02T10_01_35.686366
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T10-01-35.686366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T10-01-35.686366.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_02T09_54_15.565869
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T09-54-15.565869.parquet'
- split: 2024_02_02T10_01_35.686366
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T10-01-35.686366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T10-01-35.686366.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_02T09_54_15.565869
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T09-54-15.565869.parquet'
- split: 2024_02_02T10_01_35.686366
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T10-01-35.686366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T10-01-35.686366.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_02T09_54_15.565869
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T09-54-15.565869.parquet'
- split: 2024_02_02T10_01_35.686366
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T10-01-35.686366.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T10-01-35.686366.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_02T09_54_15.565869
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-02T09-54-15.565869.parquet'
- split: 2024_02_02T10_01_35.686366
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-02T10-01-35.686366.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-02T10-01-35.686366.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_02T09_54_15.565869
path:
- '**/details_harness|winogrande|5_2024-02-02T09-54-15.565869.parquet'
- split: 2024_02_02T10_01_35.686366
path:
- '**/details_harness|winogrande|5_2024-02-02T10-01-35.686366.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-02T10-01-35.686366.parquet'
- config_name: results
data_files:
- split: 2024_02_02T09_54_15.565869
path:
- results_2024-02-02T09-54-15.565869.parquet
- split: 2024_02_02T10_01_35.686366
path:
- results_2024-02-02T10-01-35.686366.parquet
- split: latest
path:
- results_2024-02-02T10-01-35.686366.parquet
---
# Dataset Card for Evaluation run of rizla/raccoon-small
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [rizla/raccoon-small](https://huggingface.co/rizla/raccoon-small) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_rizla__raccoon-small",
"harness_winogrande_5",
split="train")
```
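Each configuration also exposes one split per run timestamp plus a `latest` alias, as listed in the metadata above. Since the timestamped split names are ISO-like, they sort lexicographically, so the most recent run can be resolved locally with a simple `max()`. A minimal sketch (split names taken from the configurations above; no network access required):

```python
# The timestamped split names of a configuration, as listed in the YAML
# header above. The "latest" alias points at the most recent run; because
# the names sort lexicographically by timestamp, max() finds it directly.
splits = ["2024_02_02T09_54_15.565869", "2024_02_02T10_01_35.686366"]
latest = max(splits)
print(latest)  # → 2024_02_02T10_01_35.686366
```

The resolved name can then be passed as `split=` to `load_dataset` in place of `"train"` or `"latest"` to pin an analysis to a specific run.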
## Latest results
These are the [latest results from run 2024-02-02T10:01:35.686366](https://huggingface.co/datasets/open-llm-leaderboard/details_rizla__raccoon-small/blob/main/results_2024-02-02T10-01-35.686366.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.650595848646737,
"acc_stderr": 0.03218536411440889,
"acc_norm": 0.651267003548253,
"acc_norm_stderr": 0.032861158588995604,
"mc1": 0.6193390452876377,
"mc1_stderr": 0.016997627871907915,
"mc2": 0.7673830386789108,
"mc2_stderr": 0.013988013317866293
},
"harness|arc:challenge|25": {
"acc": 0.7320819112627986,
"acc_stderr": 0.01294203019513643,
"acc_norm": 0.7440273037542662,
"acc_norm_stderr": 0.01275301324124452
},
"harness|hellaswag|10": {
"acc": 0.7184823740290779,
"acc_stderr": 0.00448820175664258,
"acc_norm": 0.8872734515036845,
"acc_norm_stderr": 0.003156118964752944
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.02825420034443866,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.02825420034443866
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.0358687928008034,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.0358687928008034
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.03669072477416907,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.03669072477416907
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482758,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482758
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.025487187147859375,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.025487187147859375
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7935483870967742,
"acc_stderr": 0.023025899617188723,
"acc_norm": 0.7935483870967742,
"acc_norm_stderr": 0.023025899617188723
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.806060606060606,
"acc_stderr": 0.03087414513656209,
"acc_norm": 0.806060606060606,
"acc_norm_stderr": 0.03087414513656209
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.029376616484945633,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.029376616484945633
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.023381935348121427,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.023381935348121427
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6717948717948717,
"acc_stderr": 0.023807633198657266,
"acc_norm": 0.6717948717948717,
"acc_norm_stderr": 0.023807633198657266
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.028972648884844267,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.028972648884844267
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.029597329730978093,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.029597329730978093
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.03861557546255169,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.03861557546255169
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8513761467889909,
"acc_stderr": 0.015251253773660836,
"acc_norm": 0.8513761467889909,
"acc_norm_stderr": 0.015251253773660836
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5231481481481481,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.5231481481481481,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.025845017986926917,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.025845017986926917
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.02595502084162112,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.02595502084162112
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6636771300448431,
"acc_stderr": 0.031708824268455,
"acc_norm": 0.6636771300448431,
"acc_norm_stderr": 0.031708824268455
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159465,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159465
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.0335195387952127,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.0335195387952127
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406974,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406974
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8212005108556832,
"acc_stderr": 0.013702643715368983,
"acc_norm": 0.8212005108556832,
"acc_norm_stderr": 0.013702643715368983
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7225433526011561,
"acc_stderr": 0.024105712607754307,
"acc_norm": 0.7225433526011561,
"acc_norm_stderr": 0.024105712607754307
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4301675977653631,
"acc_stderr": 0.01655860163604104,
"acc_norm": 0.4301675977653631,
"acc_norm_stderr": 0.01655860163604104
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7320261437908496,
"acc_stderr": 0.025360603796242557,
"acc_norm": 0.7320261437908496,
"acc_norm_stderr": 0.025360603796242557
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7345679012345679,
"acc_stderr": 0.024569223600460845,
"acc_norm": 0.7345679012345679,
"acc_norm_stderr": 0.024569223600460845
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5070921985815603,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.5070921985815603,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46153846153846156,
"acc_stderr": 0.01273239828619044,
"acc_norm": 0.46153846153846156,
"acc_norm_stderr": 0.01273239828619044
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.02824568739146292,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.02824568739146292
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6683006535947712,
"acc_stderr": 0.019047485239360378,
"acc_norm": 0.6683006535947712,
"acc_norm_stderr": 0.019047485239360378
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142783,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142783
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.02650859065623327,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.02650859065623327
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685515,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685515
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.029913127232368043,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.029913127232368043
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6193390452876377,
"mc1_stderr": 0.016997627871907915,
"mc2": 0.7673830386789108,
"mc2_stderr": 0.013988013317866293
},
"harness|winogrande|5": {
"acc": 0.8737174427782163,
"acc_stderr": 0.009335559129908475
},
"harness|gsm8k|5": {
"acc": 0.5686125852918877,
"acc_stderr": 0.013642195352511568
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
liuyanchen1015/MULTI_VALUE_cola_you_ye | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 3857
num_examples: 57
- name: test
num_bytes: 3456
num_examples: 48
- name: train
num_bytes: 39547
num_examples: 540
download_size: 26466
dataset_size: 46860
---
# Dataset Card for "MULTI_VALUE_cola_you_ye"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
lifiaresearch/StructureSegPlans | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 7496661.0
num_examples: 544
- name: test
num_bytes: 2129627.0
num_examples: 136
download_size: 6596410
dataset_size: 9626288.0
---
# Dataset Card for "StructureSegPlans"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Seenka/banners-Canal_13_AR-20230629T200000-20230629T210000 | ---
dataset_info:
features:
- name: image
dtype: image
- name: timestamp
dtype: timestamp[ms, tz=America/Argentina/Buenos_Aires]
- name: video_storage_path
dtype: string
- name: timedelta
dtype: time64[us]
- name: yolo_seenka_out
list:
- name: class
dtype: int64
- name: confidence
dtype: float64
- name: name
dtype: string
- name: xmax
dtype: float64
- name: xmin
dtype: float64
- name: ymax
dtype: float64
- name: ymin
dtype: float64
- name: yolo_filter_param
dtype: int64
- name: cropped_seenka_image
dtype: image
- name: embeddings_cropped
sequence: float32
- name: entropy
dtype: float64
- name: contrast
dtype: float64
splits:
- name: train
num_bytes: 737628224.5
num_examples: 3598
download_size: 737490665
dataset_size: 737628224.5
---
# Dataset Card for "banners-Canal_13_AR-20230629T200000-20230629T210000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
thuan9889/dataset | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 4206897
num_examples: 1000
download_size: 2251102
dataset_size: 4206897
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
benayas/atis_artificial_5pct_v2 | ---
dataset_info:
features:
- name: text
dtype: string
- name: category
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 428817
num_examples: 4455
download_size: 138732
dataset_size: 428817
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
onghh0123/mini-platypus-three | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 4188523
num_examples: 1007
download_size: 2249806
dataset_size: 4188523
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-html-25000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 658157
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
teddy-f-47/wikipedia_id_20231201 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 1133503236
num_examples: 669160
download_size: 587745555
dataset_size: 1133503236
---
# Dataset Card for "wikipedia_id_20231201"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ovior/twitter_dataset_1713043877 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 2407590
num_examples: 7453
download_size: 1352925
dataset_size: 2407590
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/nachi_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of nachi/那智/那智 (Azur Lane)
This is the dataset of nachi/那智/那智 (Azur Lane), containing 10 images and their tags.
The core tags of this character are `bangs, breasts, hat, long_hair, large_breasts, very_long_hair, black_headwear, animal_ears, bow, pink_eyes, peaked_cap, brown_hair, floating_hair, hair_ornament`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 10 | 20.39 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nachi_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 10 | 10.31 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nachi_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 21 | 19.58 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nachi_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 10 | 17.52 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nachi_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 21 | 30.93 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nachi_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/nachi_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 10 |  |  |  |  |  | 1girl, looking_at_viewer, smile, solo, white_shirt, black_thighhighs, blue_skirt, cleavage, pink_bra, closed_mouth, collared_shirt, heart, miniskirt, nail_polish, navel, one_eye_closed, pleated_skirt, midriff, stomach, striped, white_background, blush, bra_peek, collarbone, crop_top, dress_shirt, full_body, garter_straps, simple_background, sleeves_rolled_up, tongue_out, black_footwear, breast_pocket, crossed_legs, hand_on_hip, hand_on_own_thigh, high_heels, jewelry, partially_unbuttoned, school_uniform, sitting, standing, zettai_ryouiki |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | smile | solo | white_shirt | black_thighhighs | blue_skirt | cleavage | pink_bra | closed_mouth | collared_shirt | heart | miniskirt | nail_polish | navel | one_eye_closed | pleated_skirt | midriff | stomach | striped | white_background | blush | bra_peek | collarbone | crop_top | dress_shirt | full_body | garter_straps | simple_background | sleeves_rolled_up | tongue_out | black_footwear | breast_pocket | crossed_legs | hand_on_hip | hand_on_own_thigh | high_heels | jewelry | partially_unbuttoned | school_uniform | sitting | standing | zettai_ryouiki |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:--------|:-------|:--------------|:-------------------|:-------------|:-----------|:-----------|:---------------|:-----------------|:--------|:------------|:--------------|:--------|:-----------------|:----------------|:----------|:----------|:----------|:-------------------|:--------|:-----------|:-------------|:-----------|:--------------|:------------|:----------------|:--------------------|:--------------------|:-------------|:-----------------|:----------------|:---------------|:--------------|:--------------------|:-------------|:----------|:-----------------------|:-----------------|:----------|:-----------|:-----------------|
| 0 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
adityarra07/train_10000_2 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcription
dtype: string
- name: id
dtype: string
splits:
- name: train
num_bytes: 1332786972.6379318
num_examples: 10000
- name: test
num_bytes: 26655739.452758636
num_examples: 200
download_size: 1340054284
dataset_size: 1359442712.0906904
---
# Dataset Card for "train_10000_2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
cs-uche/car_dealership | ---
license: apache-2.0
language:
- en
tags:
- retail
- car
pretty_name: Car Dealership
size_categories:
- 1M<n<10M
task_categories:
- feature-extraction
---
Retail Car Dealership Data
_____
Data for a car dealership, sourced from Kaggle. Perform EDA, extract features, and clean it up.
Try it out! Its primary goal is to provide an interface for users to download the dataset and experiment with it. |
liuyanchen1015/MULTI_VALUE_sst2_zero_degree | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev
num_bytes: 4537
num_examples: 32
- name: test
num_bytes: 9294
num_examples: 66
- name: train
num_bytes: 148096
num_examples: 1465
download_size: 78906
dataset_size: 161927
---
# Dataset Card for "MULTI_VALUE_sst2_zero_degree"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
krisrod/test_name | ---
license: llama2
---
{ "from": "human", "value": "Your name is Su Wen" } |
cuad | ---
annotations_creators:
- expert-generated
language_creators:
- found
language:
- en
license:
- cc-by-4.0
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- question-answering
task_ids:
- closed-domain-qa
- extractive-qa
paperswithcode_id: cuad
pretty_name: CUAD
train-eval-index:
- config: default
task: question-answering
task_id: extractive_question_answering
splits:
train_split: train
eval_split: test
col_mapping:
question: question
context: context
answers:
text: text
answer_start: answer_start
metrics:
- type: cuad
name: CUAD
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
splits:
- name: train
num_bytes: 1466037640
num_examples: 22450
- name: test
num_bytes: 198543467
num_examples: 4182
download_size: 18309308
dataset_size: 1664581107
---
# Dataset Card for CUAD
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [Contract Understanding Atticus Dataset](https://www.atticusprojectai.org/cuad)
- **Repository:** [Contract Understanding Atticus Dataset](https://github.com/TheAtticusProject/cuad/)
- **Paper:** [CUAD: An Expert-Annotated NLP Dataset for Legal Contract Review](https://arxiv.org/abs/2103.06268)
- **Point of Contact:** [Atticus Project Team](info@atticusprojectai.org)
### Dataset Summary
Contract Understanding Atticus Dataset (CUAD) v1 is a corpus of more than 13,000 labels in 510 commercial legal contracts that have been manually labeled to identify 41 categories of important clauses that lawyers look for when reviewing contracts in connection with corporate transactions.
CUAD is curated and maintained by The Atticus Project, Inc. to support NLP research and development in legal contract review. Analysis of CUAD can be found at https://arxiv.org/abs/2103.06268. Code for replicating the results and the trained model can be found at https://github.com/TheAtticusProject/cuad.
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
The dataset contains samples in English only.
## Dataset Structure
### Data Instances
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"answers": {
"answer_start": [44],
"text": ['DISTRIBUTOR AGREEMENT']
},
"context": 'EXHIBIT 10.6\n\n DISTRIBUTOR AGREEMENT\n\n THIS DISTRIBUTOR AGREEMENT (the "Agreement") is made by and between Electric City Corp., a Delaware corporation ("Company") and Electric City of Illinois LLC ("Distributor") this 7th day of September, 1999...',
"id": "LIMEENERGYCO_09_09_1999-EX-10-DISTRIBUTOR AGREEMENT__Document Name_0",
"question": "Highlight the parts (if any) of this contract related to "Document Name" that should be reviewed by a lawyer. Details: The name of the contract",
"title": "LIMEENERGYCO_09_09_1999-EX-10-DISTRIBUTOR AGREEMENT"
}
```
### Data Fields
- `id`: a `string` feature.
- `title`: a `string` feature.
- `context`: a `string` feature.
- `question`: a `string` feature.
- `answers`: a dictionary feature containing:
- `text`: a `string` feature.
- `answer_start`: a `int32` feature.
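The `answer_start` offsets index directly into `context`. A minimal sketch of recovering an answer span, using a context padded so the example offset above lines up (the exact whitespace of the real document may differ):

```python
# SQuAD-style convention: answers["text"][i] should equal
# context[start : start + len(text)] for start = answers["answer_start"][i].
context = "EXHIBIT 10.6\n\n" + " " * 30 + "DISTRIBUTOR AGREEMENT\n..."
answers = {"answer_start": [44], "text": ["DISTRIBUTOR AGREEMENT"]}

for start, text in zip(answers["answer_start"], answers["text"]):
    span = context[start : start + len(text)]
    assert span == text  # the slice recovers the annotated span
```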
### Data Splits
This dataset is split into train and test sets. The number of samples in each set is given below:
| | Train | Test |
| ----- | ------ | ---- |
| CUAD | 22450 | 4182 |
## Dataset Creation
### Curation Rationale
A highly valuable specialized task without a public large-scale dataset is contract review, which costs humans substantial time, money, and attention. Many law firms spend approximately 50% of their time reviewing contracts (CEB, 2017). Due to the specialized training necessary to understand and interpret contracts, the billing rates for lawyers at large law firms are typically around $500-$900 per hour in the US. As a result, many transactions cost companies hundreds of thousands of dollars just so that lawyers can verify that there are no problematic obligations or requirements included in the contracts. Contract review can be a source of drudgery and, in comparison to other legal tasks, is widely considered to be especially boring.
Contract review costs also affect consumers. Since contract review costs are so prohibitive, contract review is not often performed outside corporate transactions. Small companies and individuals consequently often sign contracts without even reading them, which can result in predatory behavior that harms consumers. Automating contract review by openly releasing high-quality data and fine-tuned models can increase access to legal support for small businesses and individuals, so that legal support is not exclusively available to wealthy companies.
To reduce the disparate societal costs of contract review, and to study how well NLP models generalize to specialized domains, the authors introduced a new large-scale dataset for contract review. As part of The Atticus Project, a non-profit organization of legal experts, they introduced CUAD, the Contract Understanding Atticus Dataset. This dataset was created through a year-long effort by dozens of law student annotators, lawyers, and machine learning researchers. It includes more than 500 contracts and more than 13,000 expert annotations spanning 41 label categories. For each of the 41 labels, models must learn to highlight the portions of a contract most salient to that label, making the task a matter of finding needles in a haystack.
### Source Data
#### Initial Data Collection and Normalization
The CUAD includes commercial contracts selected from 25 different types of contracts based on the contract names as shown below. Within each type, the creators randomly selected contracts based on the names of the filing companies across the alphabet.
| Type of Contract | # of Docs |
| ---------------- | --------- |
| Affiliate Agreement | 10 |
| Agency Agreement | 13 |
| Collaboration/Cooperation Agreement | 26 |
| Co-Branding Agreement | 22 |
| Consulting Agreement | 11 |
| Development Agreement | 29 |
| Distributor Agreement | 32 |
| Endorsement Agreement | 24 |
| Franchise Agreement | 15 |
| Hosting Agreement | 20 |
| IP Agreement | 17 |
| Joint Venture Agreement | 23 |
| License Agreement | 33 |
| Maintenance Agreement | 34 |
| Manufacturing Agreement | 17 |
| Marketing Agreement | 17 |
| Non-Compete/No-Solicit/Non-Disparagement Agreement | 3 |
| Outsourcing Agreement | 18 |
| Promotion Agreement | 12 |
| Reseller Agreement | 12 |
| Service Agreement | 28 |
| Sponsorship Agreement | 31 |
| Supply Agreement | 18 |
| Strategic Alliance Agreement | 32 |
| Transportation Agreement | 13 |
| **Total** | **510** |
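As a quick sanity check, the per-type counts listed above sum to the stated total (a sketch; the counts are transcribed from this card):

```python
# Document counts per contract type, transcribed from the list above.
contract_counts = {
    "Affiliate Agreement": 10,
    "Agency Agreement": 13,
    "Collaboration/Cooperation Agreement": 26,
    "Co-Branding Agreement": 22,
    "Consulting Agreement": 11,
    "Development Agreement": 29,
    "Distributor Agreement": 32,
    "Endorsement Agreement": 24,
    "Franchise Agreement": 15,
    "Hosting Agreement": 20,
    "IP Agreement": 17,
    "Joint Venture Agreement": 23,
    "License Agreement": 33,
    "Maintenance Agreement": 34,
    "Manufacturing Agreement": 17,
    "Marketing Agreement": 17,
    "Non-Compete/No-Solicit/Non-Disparagement Agreement": 3,
    "Outsourcing Agreement": 18,
    "Promotion Agreement": 12,
    "Reseller Agreement": 12,
    "Service Agreement": 28,
    "Sponsorship Agreement": 31,
    "Supply Agreement": 18,
    "Strategic Alliance Agreement": 32,
    "Transportation Agreement": 13,
}
assert len(contract_counts) == 25          # 25 contract types
assert sum(contract_counts.values()) == 510  # matches TOTAL above
```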
#### Who are the source language producers?
The contracts were sourced from EDGAR, the Electronic Data Gathering, Analysis, and Retrieval system used at the U.S. Securities and Exchange Commission (SEC). Publicly traded companies in the United States are required to file certain contracts under the SEC rules. Access to these contracts is available to the public for free at https://www.sec.gov/edgar. Please read the Datasheet at https://www.atticusprojectai.org/ for information on the intended use and limitations of the CUAD.
### Annotations
#### Annotation process
The labeling process included multiple steps to ensure accuracy:
1. Law Student Training: law students attended training sessions on each of the categories that included a summary, video instructions by experienced attorneys, multiple quizzes and workshops. Students were then required to label sample contracts in eBrevia, an online contract review tool. The initial training took approximately 70-100 hours.
2. Law Student Label: law students conducted manual contract review and labeling in eBrevia.
3. Key Word Search: law students conducted keyword searches in eBrevia to capture additional categories that had been missed during the "Student Label" step.
4. Category-by-Category Report Review: law students exported the labeled clauses into reports, reviewed each clause category by category, and highlighted clauses that they believed were mislabeled.
5. Attorney Review: experienced attorneys reviewed the category-by-category reports with the students' comments, provided feedback, and addressed student questions. When applicable, attorneys discussed the results with the students and reached consensus. Students then made changes in eBrevia accordingly.
6. eBrevia Extras Review: attorneys and students used eBrevia to generate a list of "extras", i.e., clauses that the eBrevia AI tool identified as responsive to a category but that were not labeled by human annotators. Attorneys and students reviewed all of the "extras" and added the correct ones. This process was repeated until all or substantially all of the "extras" were incorrect labels.
7. Final Report: The final report was exported into a CSV file. Volunteers manually added the “Yes/No” answer column to categories that do not contain an answer.
#### Who are the annotators?
See the annotation process described in the section above.
### Personal and Sensitive Information
Some clauses in the files are redacted because the party submitting these contracts redacted them to protect confidentiality. Such redaction may show up as asterisks (\*\*\*) or underscores (\_\_\_) or blank spaces. The dataset and the answers reflect such redactions. For example, the answer for “January \_\_ 2020” would be “1/[]/2020”).
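A minimal sketch of how such redaction markers might be detected when processing the converted TXT files. The patterns mirror the markers described above (runs of asterisks or underscores, bracketed asterisks); the exact thresholds are an assumption, not part of the dataset specification.

```python
import re

# Redaction markers as described in the card: runs of asterisks (***),
# runs of underscores (___), or the bracketed form [* * *].
REDACTION = re.compile(r"(\[\*\s*\*\s*\*\]|\*{3,}|_{3,})")

def find_redactions(text: str):
    """Return all redaction markers found in a contract text."""
    return [m.group(0) for m in REDACTION.finditer(text)]

print(find_redactions("This Agreement dated January ___ 2020, fee of $***"))
# → ['___', '***']
```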
For any categories that require a "Yes/No" answer, annotators included full sentences as text context from the contract. To maintain consistency and minimize inter-annotator disagreement, annotators selected the full sentence, following the instruction to select "from period to period".
For the other categories, annotators selected segments of the text in the contract that are responsive to each such category. One category in a contract may include multiple labels. For example, “Parties” may include 4-10 separate text strings that are not continuous in a contract. The answer is presented in the unified format separated by semicolons of “Party A Inc. (“Party A”); Party B Corp. (“Party B”)”.
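Given the unified semicolon-separated format described above, a category answer can be split back into its individual text strings. This is a simple sketch based only on the format stated in the card; it does not account for semicolons that might occur inside a clause.

```python
def split_answer(answer: str):
    """Split a unified semicolon-separated CUAD answer (e.g. the
    'Parties' category) into its individual text strings."""
    return [part.strip() for part in answer.split(";") if part.strip()]

print(split_answer('Party A Inc. ("Party A"); Party B Corp. ("Party B")'))
# → ['Party A Inc. ("Party A")', 'Party B Corp. ("Party B")']
```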
Some sentences in the files include confidential legends that are not part of the contracts. An example of such confidential legend is as follows:
THIS EXHIBIT HAS BEEN REDACTED AND IS THE SUBJECT OF A CONFIDENTIAL TREATMENT REQUEST. REDACTED MATERIAL IS MARKED WITH [* * *] AND HAS BEEN FILED SEPARATELY WITH THE SECURITIES AND EXCHANGE COMMISSION.
Some sentences in the files contain irrelevant information such as footers or page numbers. Some sentences may not be relevant to the corresponding category. Some sentences may correspond to a different category. Because many legal clauses are very long and contain various sub-parts, sometimes only a sub-part of a sentence is responsive to a category.
To address the foregoing limitations, annotators manually deleted the portion that is not responsive, replacing it with the symbol "<omitted>" to indicate that the two text segments do not appear immediately next to each other in the contracts. For example, if a “Termination for Convenience” clause starts with “Each Party may terminate this Agreement if” followed by three subparts “(a), (b) and (c)”, but only subpart (c) is responsive to this category, the authors manually deleted subparts (a) and (b) and replaced them with the symbol "<omitted>”. Another example is for “Effective Date”, the contract includes a sentence “This Agreement is effective as of the date written above” that appears after the date “January 1, 2010”. The annotation is as follows: “January 1, 2010 <omitted> This Agreement is effective as of the date written above.”
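Since the `<omitted>` symbol marks a gap between two non-adjacent text segments, an annotation can be decomposed into the segments that actually appear in the contract. The sketch below assumes the literal `<omitted>` marker described above.

```python
def split_segments(annotation: str):
    """Split a CUAD annotation on the '<omitted>' marker, returning
    the text segments that appear (non-contiguously) in the contract."""
    return [seg.strip() for seg in annotation.split("<omitted>") if seg.strip()]

ann = "January 1, 2010 <omitted> This Agreement is effective as of the date written above."
print(split_segments(ann))
# → ['January 1, 2010', 'This Agreement is effective as of the date written above.']
```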
Because the contracts were converted from PDF into TXT files, the converted TXT files may not stay true to the format of the original PDF files. For example, some contracts contain inconsistent spacing between words, sentences and paragraphs. Table format is not maintained in the TXT files.
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
Attorney Advisors
Wei Chen, John Brockland, Kevin Chen, Jacky Fink, Spencer P. Goodson, Justin Haan, Alex Haskell, Kari Krusmark, Jenny Lin, Jonas Marson, Benjamin Petersen, Alexander Kwonji Rosenberg, William R. Sawyers, Brittany Schmeltz, Max Scott, Zhu Zhu
Law Student Leaders
John Batoha, Daisy Beckner, Lovina Consunji, Gina Diaz, Chris Gronseth, Calvin Hannagan, Joseph Kroon, Sheetal Sharma Saran
Law Student Contributors
Scott Aronin, Bryan Burgoon, Jigar Desai, Imani Haynes, Jeongsoo Kim, Margaret Lynch, Allison Melville, Felix Mendez-Burgos, Nicole Mirkazemi, David Myers, Emily Rissberger, Behrang Seraj, Sarahginy Valcin
Technical Advisors & Contributors
Dan Hendrycks, Collin Burns, Spencer Ball, Anya Chen
### Licensing Information
CUAD is licensed under the Creative Commons Attribution 4.0 (CC BY 4.0) license and free to the public for commercial and non-commercial use.
The creators make no representations or warranties regarding the license status of the underlying contracts, which are publicly available and downloadable from EDGAR.
Privacy Policy & Disclaimers
The categories or the contracts included in the dataset are not comprehensive or representative. The authors encourage the public to help improve them by sending them your comments and suggestions to info@atticusprojectai.org. Comments and suggestions will be reviewed by The Atticus Project at its discretion and will be included in future versions of Atticus categories once approved.
The use of CUAD is subject to their privacy policy https://www.atticusprojectai.org/privacy-policy and disclaimer https://www.atticusprojectai.org/disclaimer.
### Citation Information
```
@article{hendrycks2021cuad,
title={CUAD: An Expert-Annotated NLP Dataset for Legal Contract Review},
author={Dan Hendrycks and Collin Burns and Anya Chen and Spencer Ball},
journal={arXiv preprint arXiv:2103.06268},
year={2021}
}
```
### Contributions
Thanks to [@bhavitvyamalik](https://github.com/bhavitvyamalik) for adding this dataset. |
abtExp/synthetic_license_plates | ---
license: mit
---
|
stas/general-pmd-synthetic-testing | ---
license: bigscience-openrail-m
---
This dataset is designed to be used in testing. It's derived from the general-pmd/localized_narratives__ADE20k dataset.
The current splits are: `['100.unique', '100.repeat', '300.unique', '300.repeat', '1k.unique', '1k.repeat', '10k.unique', '10k.repeat']`.
The `unique` ones ensure uniqueness across `text` entries.
The `repeat` ones repeat the same 10 unique records; these are useful for debugging memory leaks, since the records are always the same and thus remove record variation from the equation.
The default split is `100.unique`
The full process of this dataset creation, including which records were used to build it, is documented inside [general-pmd-synthetic-testing.py](https://huggingface.co/datasets/HuggingFaceM4/general-pmd-synthetic-testing/blob/main/general-pmd-synthetic-testing.py)
|
hellosimple/processed_bert_dataset | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: token_type_ids
sequence: int8
- name: attention_mask
sequence: int8
- name: special_tokens_mask
sequence: int8
splits:
- name: train
num_bytes: 24027462000.0
num_examples: 6674295
download_size: 5887380148
dataset_size: 24027462000.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
gigant/tib_slides | ---
dataset_info:
features:
- name: Image
dtype: image
- name: file_name
dtype: string
splits:
- name: train
num_bytes: 131956494917.654
num_examples: 484843
download_size: 0
dataset_size: 131956494917.654
---
# Dataset Card for "tib_slides"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
doceoSoftware/docvqa_clicars_fitxatecnica_Mireia_270_5 | ---
dataset_info:
features:
- name: image
dtype: image
- name: query
sequence: string
- name: answers
sequence: string
- name: ground_truth
dtype: string
splits:
- name: train
num_bytes: 28855807.0
num_examples: 270
- name: test
num_bytes: 482367.0
num_examples: 4
download_size: 29017641
dataset_size: 29338174.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
alvations/units | ---
license: cc0-1.0
---
This is a list of units of measurement, human-translated from English into multiple languages:
- Arabic
- Bengali
- Chinese (CN)
- Chinese (HK)
- Chinese (TW)
- Czech
- Dutch
- English
- French (CA)
- French (FR)
- German
- Hebrew
- Hindi
- Italian
- Japanese
- Korean
- Marathi
- Nepali
- Polish
- Portuguese (BR)
- Portuguese (PT)
- Russian
- Spanish (Latin America)
- Spanish (Mexico)
- Spanish (Spain)
- Swedish
- Turkish
- Vietnamese
- Aymara
- Nahuatl
- Indonesian
- Malay
- Thai
- Finnish
- Hungarian
- Greek
- Norwegian
- Catalan
- Tagalog
- Ukrainian
# Cite
> Liling Tan (2024) LexMT: An Analysis of Machine Translation for Learners Lexicon. https://huggingface.co/datasets/alvations/
```
@article{tan-2023-lexmt-bodyparts,
title = "LexMT: An Analysis of Machine Translation for Learners Lexicon",
author = "Tan, Liling",
journal = "alvations.com",
year = "2024",
month = "Feb",
url = "https://huggingface.co/datasets/alvations/"
}
``` |
MagicHub/high-quality-prompt-dataset | ---
license: cc-by-4.0
---
|
metricv/metricsubs-chunktranslate | ---
license: mit
task_categories:
- text2text-generation
language:
- en
- zh
size_categories:
- n<1K
configs:
- config_name: default
default: true
data_files:
- split: train
path: "train.json"
- split: test
path: "test.json"
- config_name: chatgpt
data_files:
- split: train
path: "chatgpt-train.jsonl"
- split: test
path: "chatgpt-test.jsonl"
- config_name: chatgpt-recent
data_files:
- split: train
path: "chatgpt-recent-train.jsonl"
- split: test
path: "chatgpt-recent-test.jsonl"
--- |
Vinnyyw/Anahisong | ---
license: openrail
---
|
JoshVictor/medcode-FT-CuData | ---
dataset_info:
features:
- name: oneline diagnosis summary
dtype: string
- name: medical codes
dtype: string
splits:
- name: train
num_bytes: 7208
num_examples: 50
download_size: 5746
dataset_size: 7208
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
kuotient/reddit_enko_translation_preference | ---
license: cc-by-nc-sa-4.0
size_categories:
- 1K<n<10K
---
## reddit_enko_translation_preference
Can be used for RLHF (CPO, DPO, etc.)
- reject: DeepL
- chosen: GPT4-Turbo
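A minimal sketch of shaping rows of such a preference dataset into DPO-style triples. The field names used here (`source`, `gpt4_translation`, `deepl_translation`) are assumptions for illustration, not the confirmed column names of this dataset; adapt them to the actual schema.

```python
def to_dpo_triples(rows):
    """Map preference rows to DPO-style (prompt, chosen, rejected) triples.
    Field names are hypothetical placeholders, not the dataset's schema."""
    return [
        {
            "prompt": row["source"],               # English source text
            "chosen": row["gpt4_translation"],     # GPT4-Turbo output (chosen)
            "rejected": row["deepl_translation"],  # DeepL output (rejected)
        }
        for row in rows
    ]

sample = [{"source": "Hello", "gpt4_translation": "안녕하세요", "deepl_translation": "여보세요"}]
print(to_dpo_triples(sample))
```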
Translations of comments and posts from various subreddits on Reddit.
The rejected answers are DeepL translations and the chosen ones are GPT-4 translations, but GPT's translations are not necessarily better than DeepL's. Please use the dataset in the way that suits your approach. |
ttr12138/dogs | ---
license: mit
dataset_info:
features:
- name: image
dtype: image
- name: bbox
sequence:
sequence: float64
- name: categories
sequence: int64
splits:
- name: train
num_bytes: 16552.0
num_examples: 2
- name: test
num_bytes: 4279.0
num_examples: 1
download_size: 25856
dataset_size: 20831.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
pvduy/openai_summarize_tldr_human_eval_ilql_result | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: ILQL_125M
dtype: string
- name: ILQL_1B
dtype: string
- name: ILQL_6B
dtype: string
splits:
- name: train
num_bytes: 176617
num_examples: 100
download_size: 122501
dataset_size: 176617
---
# Dataset Card for "openai_summarize_tldr_human_eval_ilql_result"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
TinyPixel/guanaco-m | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 15877537
num_examples: 9846
download_size: 9237302
dataset_size: 15877537
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "guanaco-m"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Lazylaziness/Diona | ---
license: other
---
|
theatticusproject/maud | ---
license: cc-by-4.0
---
|
tthoraldson/OasisLyrics | ---
license: cc
---
|
autoevaluate/autoeval-eval-xglue-mlqa-a70280-48375145242 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- xglue
eval_info:
task: summarization
model: csebuetnlp/mT5_m2o_arabic_crossSum
metrics: ['bleu', 'f1', 'accuracy']
dataset_name: xglue
dataset_config: mlqa
dataset_split: test.ar
col_mapping:
text: context
target: question
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: csebuetnlp/mT5_m2o_arabic_crossSum
* Dataset: xglue
* Config: mlqa
* Split: test.ar
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@Anwaarma](https://huggingface.co/Anwaarma) for evaluating this model. |
OptimusKoala/topachat_v2 | ---
license: apache-2.0
---
|
anan-2024/twitter_dataset_1713154698 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 86050
num_examples: 221
download_size: 50385
dataset_size: 86050
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
austenjs/ClueCorpusSmallDataset | ---
license: mit
---
|
yusuf802/leaf-images | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': Apple_Black_rot
'1': Apple_Cedar_apple_rust
'2': Apple_Powdery_mildew
'3': Apple_healthy
'4': Apple_scab
'5': Cherry_(including_sour)_Powdery_mildew
'6': Cherry_(including_sour)_healthy
'7': Corn_(maize)_Cercospora_leaf_spot Gray_leaf_spot
'8': Corn_(maize)_Common_rust
'9': Corn_(maize)_Northern_Leaf_Blight
'10': Corn_(maize)_healthy
'11': Cotton_leaf_diseased
'12': Cotton_leaf_fresh
'13': Grape_Black_rot
'14': Grape___Esca_(Black_Measles)
'15': Grape___Leaf_blight_(Isariopsis_Leaf_Spot)
'16': Grape___healthy
'17': Orange_Haunglongbing_(Citrus_greening)
'18': Orange__Black_Rot
'19': Orange__Canker
'20': Orange__Healthy
'21': Peach_Bacterial_spot
'22': Peach_healthy
'23': Pepper,_bell_Bacterial_spot
'24': Pepper,_bell_healthy
'25': Potato_Early_blight
'26': Potato_Late_blight
'27': Potato_healthy
'28': Squash_Powdery_mildew
'29': Strawberry_Leaf_scorch
'30': Strawberry_healthy
'31': Tomato_Bacterial_spot
'32': Tomato_Early_blight
'33': Tomato_Late_blight
'34': Tomato_Leaf_Mold
'35': Tomato_Septoria_leaf_spot
'36': Tomato_Spider_mites_Two_spotted_spider_mite
'37': Tomato_Target_Spot
'38': Tomato_Tomato_Yellow_Leaf_Curl_Virus
'39': Tomato_Tomato_mosaic_virus
'40': Tomato_healthy
'41': Wheat_healthy
'42': Wheat_leaf_rust
'43': Wheat_nitrogen_deficiency
splits:
- name: train
num_bytes: 7355420032.737346
num_examples: 56842
- name: test
num_bytes: 1331846480.2826538
num_examples: 10032
download_size: 8653117062
dataset_size: 8687266513.02
---
# Dataset Card for "leaf-images"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Poupou/citizen-round-transactions | ---
license: mit
---
|
CyberHarem/hubble_neuralcloud | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of hubble/ハッブル/赫波 (Neural Cloud)
This is the dataset of hubble/ハッブル/赫波 (Neural Cloud), containing 23 images and their tags.
The core tags of this character are `long_hair, breasts, blue_hair, blue_eyes, very_long_hair, bangs, large_breasts, braid, multicolored_hair`, which are pruned in this dataset.
Images were crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 23 | 50.84 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hubble_neuralcloud/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 23 | 22.67 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hubble_neuralcloud/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 61 | 49.62 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hubble_neuralcloud/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 23 | 41.03 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hubble_neuralcloud/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 61 | 75.46 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hubble_neuralcloud/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/hubble_neuralcloud',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------|
| 0 | 23 |  |  |  |  |  | 1girl, solo, cleavage, looking_at_viewer, smile, necklace, white_dress, belt, fingerless_gloves |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | cleavage | looking_at_viewer | smile | necklace | white_dress | belt | fingerless_gloves |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:-----------|:--------------------|:--------|:-----------|:--------------|:-------|:--------------------|
| 0 | 23 |  |  |  |  |  | X | X | X | X | X | X | X | X | X |
|
huggingartists/linkin-park | ---
language:
- en
tags:
- huggingartists
- lyrics
---
# Dataset Card for "huggingartists/linkin-park"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [How to use](#how-to-use)
- [Dataset Structure](#dataset-structure)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [About](#about)
## Dataset Description
- **Homepage:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Repository:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of the generated dataset:** 0.684223 MB
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://images.genius.com/a865aac7693c39977b9b402dc364908e.1000x1000x1.jpg')">
</div>
</div>
<a href="https://huggingface.co/huggingartists/linkin-park">
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
</a>
<div style="text-align: center; font-size: 16px; font-weight: 800">Linkin Park</div>
<a href="https://genius.com/artists/linkin-park">
<div style="text-align: center; font-size: 14px;">@linkin-park</div>
</a>
</div>
### Dataset Summary
The Lyrics dataset parsed from Genius. This dataset is designed to generate lyrics with HuggingArtists.
Model is available [here](https://huggingface.co/huggingartists/linkin-park).
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
en
## How to use
How to load this dataset directly with the datasets library:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/linkin-park")
```
## Dataset Structure
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"text": "Look, I was gonna go easy on you\nNot to hurt your feelings\nBut I'm only going to get this one chance\nSomething's wrong, I can feel it..."
}
```
### Data Fields
The data fields are the same among all splits.
- `text`: a `string` feature.
### Data Splits
| train |validation|test|
|------:|---------:|---:|
|529| -| -|
'Train' can be easily divided into 'train', 'validation' & 'test' with a few lines of code:
```python
from datasets import load_dataset, Dataset, DatasetDict
import numpy as np
datasets = load_dataset("huggingartists/linkin-park")
train_percentage = 0.9
validation_percentage = 0.07
test_percentage = 0.03
train, validation, test = np.split(datasets['train']['text'], [int(len(datasets['train']['text'])*train_percentage), int(len(datasets['train']['text'])*(train_percentage + validation_percentage))])
datasets = DatasetDict(
{
'train': Dataset.from_dict({'text': list(train)}),
'validation': Dataset.from_dict({'text': list(validation)}),
'test': Dataset.from_dict({'text': list(test)})
}
)
```
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@InProceedings{huggingartists,
author={Aleksey Korshuk}
year=2021
}
```
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
emozilla/pg19-test-tokenized | ---
dataset_info:
features:
- name: short_book_title
dtype: string
- name: publication_date
dtype: int32
- name: url
dtype: string
- name: text
dtype: string
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: tokenized_len
dtype: int64
splits:
- name: test
num_bytes: 97172727
num_examples: 100
download_size: 45658545
dataset_size: 97172727
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
# Dataset Card for "pg19-test-tokenized"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
huggingartists/ghostemane | ---
language:
- en
tags:
- huggingartists
- lyrics
---
# Dataset Card for "huggingartists/ghostemane"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [How to use](#how-to-use)
- [Dataset Structure](#dataset-structure)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [About](#about)
## Dataset Description
- **Homepage:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Repository:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of the generated dataset:** 0.448728 MB
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://images.genius.com/c4407bb331c50916c1dfdc7f875f73a9.1000x1000x1.jpg')">
</div>
</div>
<a href="https://huggingface.co/huggingartists/ghostemane">
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
</a>
<div style="text-align: center; font-size: 16px; font-weight: 800">Ghostemane</div>
<a href="https://genius.com/artists/ghostemane">
<div style="text-align: center; font-size: 14px;">@ghostemane</div>
</a>
</div>
### Dataset Summary
The Lyrics dataset parsed from Genius. This dataset is designed to generate lyrics with HuggingArtists.
Model is available [here](https://huggingface.co/huggingartists/ghostemane).
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
en
## How to use
How to load this dataset directly with the datasets library:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/ghostemane")
```
## Dataset Structure
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"text": "Look, I was gonna go easy on you\nNot to hurt your feelings\nBut I'm only going to get this one chance\nSomething's wrong, I can feel it..."
}
```
### Data Fields
The data fields are the same among all splits.
- `text`: a `string` feature.
### Data Splits
| train |validation|test|
|------:|---------:|---:|
|327| -| -|
'Train' can be easily divided into 'train', 'validation' & 'test' with a few lines of code:
```python
from datasets import load_dataset, Dataset, DatasetDict
import numpy as np
datasets = load_dataset("huggingartists/ghostemane")
train_percentage = 0.9
validation_percentage = 0.07
test_percentage = 0.03
train, validation, test = np.split(datasets['train']['text'], [int(len(datasets['train']['text'])*train_percentage), int(len(datasets['train']['text'])*(train_percentage + validation_percentage))])
datasets = DatasetDict(
{
'train': Dataset.from_dict({'text': list(train)}),
'validation': Dataset.from_dict({'text': list(validation)}),
'test': Dataset.from_dict({'text': list(test)})
}
)
```
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@InProceedings{huggingartists,
    author = {Aleksey Korshuk},
    year = {2021}
}
```
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
CyberHarem/inoue_orihime_bleach | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Inoue Orihime/井上織姫/黒崎織姫 (Bleach)
This is the dataset of Inoue Orihime/井上織姫/黒崎織姫 (Bleach), containing 500 images and their tags.
The core tags of this character are `long_hair, orange_hair, breasts, hair_ornament, large_breasts, brown_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g., Danbooru, Pixiv, Zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 599.13 MiB | [Download](https://huggingface.co/datasets/CyberHarem/inoue_orihime_bleach/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 364.74 MiB | [Download](https://huggingface.co/datasets/CyberHarem/inoue_orihime_bleach/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1151 | 728.28 MiB | [Download](https://huggingface.co/datasets/CyberHarem/inoue_orihime_bleach/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 535.38 MiB | [Download](https://huggingface.co/datasets/CyberHarem/inoue_orihime_bleach/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1151 | 990.72 MiB | [Download](https://huggingface.co/datasets/CyberHarem/inoue_orihime_bleach/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
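For the IMG+TXT packages, each image is expected to ship with a same-stem `.txt` tag file. A minimal sketch for pairing them after extraction (the file layout and image extensions here are assumptions; adjust to the actual archive contents):

```python
import glob
import os


def pair_images_with_tags(dataset_dir):
    """Pair each image with its same-stem .txt tag file.

    Assumes the IMG+TXT layout suggested by the package table above
    (one tag file per image, sharing the file stem); adjust if the
    archive is organized differently.
    """
    pairs = []
    patterns = ("*.png", "*.jpg", "*.jpeg", "*.webp")
    images = sorted(
        p for pat in patterns for p in glob.glob(os.path.join(dataset_dir, pat))
    )
    for image_path in images:
        tag_path = os.path.splitext(image_path)[0] + ".txt"
        if os.path.exists(tag_path):
            with open(tag_path, encoding="utf-8") as f:
                pairs.append((image_path, f.read().strip()))
    return pairs
```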
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/inoue_orihime_bleach',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be discoverable here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, school_uniform, skirt, smile, solo, socks |
| 1 | 8 |  |  |  |  |  | 1girl, school_uniform, solo, open_mouth, skirt, sweater_vest |
| 2 | 7 |  |  |  |  |  | 1girl, bow, school_uniform, smile, solo, open_mouth, blush, closed_eyes, hairpin |
| 3 | 8 |  |  |  |  |  | 1girl, looking_at_viewer, solo, smile, blush, long_sleeves, dress, hairpin, upper_body |
| 4 | 7 |  |  |  |  |  | 1girl, solo, looking_at_viewer, upper_body, watermark, web_address, open_mouth, sweater |
| 5 | 7 |  |  |  |  |  | 1girl, solo, open_mouth, pink_kimono, grey_eyes, obi, smile, hairpin |
| 6 | 6 |  |  |  |  |  | 1girl, cleavage_cutout, midriff, solo, looking_at_viewer, navel, crop_top, skirt, smile, long_sleeves |
| 7 | 8 |  |  |  |  |  | 1girl, dress, solo, grey_eyes, bare_shoulders, elbow_gloves |
| 8 | 6 |  |  |  |  |  | 1girl, collared_shirt, red_bowtie, school_uniform, solo, white_shirt, bangs, jacket, looking_at_viewer, upper_body, :d, brown_hair, long_sleeves, open_mouth, twitter_username, white_background, upper_teeth_only |
| 9 | 6 |  |  |  |  |  | 1girl, christmas, santa_costume, santa_hat, solo, grey_eyes, looking_at_viewer, open_mouth, red_dress, black_background, full_body, knee_boots, red_footwear, santa_boots, :d, fur-trimmed_dress, simple_background |
| 10 | 5 |  |  |  |  |  | 1girl, cupcake, hat, solo, food_on_face, :q, blush, bow |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | school_uniform | skirt | smile | solo | socks | open_mouth | sweater_vest | bow | blush | closed_eyes | hairpin | looking_at_viewer | long_sleeves | dress | upper_body | watermark | web_address | sweater | pink_kimono | grey_eyes | obi | cleavage_cutout | midriff | navel | crop_top | bare_shoulders | elbow_gloves | collared_shirt | red_bowtie | white_shirt | bangs | jacket | :d | brown_hair | twitter_username | white_background | upper_teeth_only | christmas | santa_costume | santa_hat | red_dress | black_background | full_body | knee_boots | red_footwear | santa_boots | fur-trimmed_dress | simple_background | cupcake | hat | food_on_face | :q |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:-----------------|:--------|:--------|:-------|:--------|:-------------|:---------------|:------|:--------|:--------------|:----------|:--------------------|:---------------|:--------|:-------------|:------------|:--------------|:----------|:--------------|:------------|:------|:------------------|:----------|:--------|:-----------|:-----------------|:---------------|:-----------------|:-------------|:--------------|:--------|:---------|:-----|:-------------|:-------------------|:-------------------|:-------------------|:------------|:----------------|:------------|:------------|:-------------------|:------------|:-------------|:---------------|:--------------|:--------------------|:--------------------|:----------|:------|:---------------|:-----|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 8 |  |  |  |  |  | X | X | X | | X | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 7 |  |  |  |  |  | X | X | | X | X | | X | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 8 |  |  |  |  |  | X | | | X | X | | | | | X | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 7 |  |  |  |  |  | X | | | | X | | X | | | | | | X | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 7 |  |  |  |  |  | X | | | X | X | | X | | | | | X | | | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 6 |  |  |  |  |  | X | | X | X | X | | | | | | | | X | X | | | | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 8 |  |  |  |  |  | X | | | | X | | | | | | | | | | X | | | | | | X | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 6 |  |  |  |  |  | X | X | | | X | | X | | | | | | X | X | | X | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | |
| 9 | 6 |  |  |  |  |  | X | | | | X | | X | | | | | | X | | | | | | | | X | | | | | | | | | | | | | X | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | |
| 10 | 5 |  |  |  |  |  | X | | | | X | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X |
|
liuyanchen1015/MULTI_VALUE_qqp_give_passive | ---
dataset_info:
features:
- name: question1
dtype: string
- name: question2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 50869
num_examples: 265
- name: test
num_bytes: 494406
num_examples: 2672
- name: train
num_bytes: 480669
num_examples: 2514
download_size: 603887
dataset_size: 1025944
---
# Dataset Card for "MULTI_VALUE_qqp_give_passive"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nelson2424/FAQ_NelsMarketplace | ---
license: mit
task_categories:
- question-answering
language:
- en
tags:
- finance
dataset_info:
features:
- name: Instruction
dtype: string
- name: Question
dtype: string
- name: Context/Answer
dtype: string
splits:
- name: train
num_bytes: 121719
num_examples: 84
download_size: 25676
dataset_size: 121719
---
This dataset was created to test two different things:
First, to check an LLM's ability to augment data coherently.
Second, to create a dataset for finetuning LLMs on the QA task.
The dataset contains the frequently asked questions, and their answers, of a made-up online fashion marketplace called Nels Marketplace. |
irds/wikiclir_tr | ---
pretty_name: '`wikiclir/tr`'
viewer: false
source_datasets: []
task_categories:
- text-retrieval
---
# Dataset Card for `wikiclir/tr`
The `wikiclir/tr` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/wikiclir#wikiclir/tr).
# Data
This dataset provides:
- `docs` (documents, i.e., the corpus); count=295,593
- `queries` (i.e., topics); count=185,388
- `qrels` (relevance assessments); count=380,651
## Usage
```python
from datasets import load_dataset
docs = load_dataset('irds/wikiclir_tr', 'docs')
for record in docs:
record # {'doc_id': ..., 'title': ..., 'text': ...}
queries = load_dataset('irds/wikiclir_tr', 'queries')
for record in queries:
record # {'query_id': ..., 'text': ...}
qrels = load_dataset('irds/wikiclir_tr', 'qrels')
for record in qrels:
record # {'query_id': ..., 'doc_id': ..., 'relevance': ..., 'iteration': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
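As a small illustration of working with the qrels records above, they can be grouped into a query-to-documents mapping (the example records here are invented, mirroring the schema shown in the usage snippet):

```python
from collections import defaultdict


def group_qrels(records):
    """Group relevance assessments into a query_id -> {doc_id: relevance} map."""
    by_query = defaultdict(dict)
    for rec in records:
        by_query[rec["query_id"]][rec["doc_id"]] = rec["relevance"]
    return by_query


# Invented records mirroring the qrels schema shown above.
example_qrels = [
    {"query_id": "q1", "doc_id": "d1", "relevance": 1, "iteration": "0"},
    {"query_id": "q1", "doc_id": "d2", "relevance": 0, "iteration": "0"},
    {"query_id": "q2", "doc_id": "d3", "relevance": 1, "iteration": "0"},
]
relevant = group_qrels(example_qrels)
```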
## Citation Information
```
@inproceedings{sasaki-etal-2018-cross,
title = "Cross-Lingual Learning-to-Rank with Shared Representations",
author = "Sasaki, Shota and
Sun, Shuo and
Schamoni, Shigehiko and
Duh, Kevin and
Inui, Kentaro",
booktitle = "Proceedings of the 2018 Conference of the North {A}merican Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 2 (Short Papers)",
month = jun,
year = "2018",
address = "New Orleans, Louisiana",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/N18-2073",
doi = "10.18653/v1/N18-2073",
pages = "458--463"
}
```
|
loubnabnl/comsop_450_samples_detailed | ---
language:
- en
dataset_info:
features:
- name: id
dtype: int64
- name: prompt
dtype: string
- name: text_token_length
dtype: int64
- name: original_text
dtype: string
- name: seed_data
dtype: string
- name: format
dtype: string
- name: audience
dtype: string
- name: generated_samples
sequence: string
- name: evaluation_prompt
dtype: string
- name: sentences
sequence: string
- name: completion
dtype: string
- name: token_length
dtype: int64
- name: passage_score
dtype: float64
- name: sentences_and_scores
list:
- name: score
dtype: float64
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 13709325
num_examples: 450
download_size: 8178189
dataset_size: 13709325
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Tensoic/Bhandara | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 81417796
num_examples: 12395
download_size: 21196767
dataset_size: 81417796
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: apache-2.0
task_categories:
- text-generation
language:
- hi
tags:
- pretrain
---
## A Pretraining Hindi Dataset for Diverse Indian NLP Tasks
This dataset contains over 12,000 rows and 7 million words of text specifically generated for pretraining NLP models on Hindi language tasks. It was created using the Bard API, ensuring high-quality and diverse content.
## Key Feature: Rich India-Specific Data
A distinguishing characteristic of this dataset is its inclusion of a substantial amount of content related to India. This makes it valuable for training models that need to understand and respond to nuances specific to the Indian context, culture, and language.
## Caution
This dataset includes a wide variety of data, but the accuracy and factuality of the information have not been verified. |
japanese-asr/whisper_transcriptions.reazonspeech.medium.wer_10.0.vectorized | ---
dataset_info:
config_name: medium
features:
- name: input_length
dtype: int64
- name: labels
sequence: int64
- name: input_features
sequence:
sequence: float32
splits:
- name: train
num_bytes: 320730340056
num_examples: 208714
download_size: 59337474292
dataset_size: 320730340056
configs:
- config_name: medium
data_files:
- split: train
path: medium/train-*
---
|
liuyanchen1015/VALUE_rte_been_done | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 44547
num_examples: 105
- name: test
num_bytes: 492663
num_examples: 1175
- name: train
num_bytes: 438233
num_examples: 990
download_size: 637646
dataset_size: 975443
---
# Dataset Card for "VALUE_rte_been_done"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dhbloo/TuSimple | ---
license: apache-2.0
---
Dataset from: https://github.com/TuSimple/tusimple-benchmark |
davidberg/inflation | ---
license: apache-2.0
---
|
vpetukhov/bible_tts_hausa | ---
annotations_creators: []
language:
- ha
language_creators:
- expert-generated
license:
- cc-by-sa-4.0
multilinguality:
- monolingual
pretty_name: BibleTTS Hausa
size_categories:
- 10K<n<100K
source_datasets:
- original
tags:
- bible
task_categories:
- automatic-speech-recognition
- text-to-speech
task_ids: []
---
# Dataset Card for BibleTTS Hausa
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
## Dataset Description
- **Homepage:** https://masakhane-io.github.io/bibleTTS/
- **Repository:** http://www.openslr.org/129/
- **Paper:** https://arxiv.org/abs/2207.03546
### Dataset Summary
BibleTTS is a large high-quality open Text-to-Speech dataset with up to 80 hours of single speaker, studio quality 48kHz recordings.
This is the Hausa part of the dataset. Aligned hours: 86.6; aligned verses: 40,603.
### Languages
Hausa
## Dataset Structure
### Data Fields
- `audio`: audio path
- `sentence`: transcription of the audio
- `locale`: always set to `ha`
- `book`: 3-char book encoding
- `verse`: verse id
### Data Splits
- `dev`: Book of Ezra (264 verses)
- `test`: Book of Colossians (124 verses)
- `train`: all other books (40215 verses)
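The split logic above can be sketched as a small routing function on the `book` field. Note the `EZR` and `COL` codes are assumptions based on common 3-character book encodings, not verified against this dataset:

```python
def split_for_book(book):
    """Route a verse to its split by the 3-char `book` field.

    The codes "EZR" (Ezra) and "COL" (Colossians) are assumed from
    common 3-character book encodings; verify against the actual data.
    """
    if book == "EZR":
        return "dev"
    if book == "COL":
        return "test"
    return "train"
```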
## Additional Information
See [this notebook](https://github.com/seads-org/hausa-speech-recognition/blob/6993c5c74379c93a2416acac6126b60ce6e52df8/notebooks/prepare_bible_dataset.ipynb) for the code showing how the dataset was processed.
### Dataset Curators
The dataset was uploaded by [vpetukhov](https://github.com/VPetukhov/), who is not affiliated with the dataset authors. Please see the project page for more information.
### Licensing Information
The data is released under a commercial-friendly [CC-BY-SA](https://creativecommons.org/licenses/by-sa/4.0/) license.
### Citation Information
Meyer, Josh, et al. "BibleTTS: a large, high-fidelity, multilingual, and uniquely African speech corpus." arXiv preprint arXiv:2207.03546 (2022).
|
omerist/turknews-mini | ---
dataset_info:
features:
- name: review
dtype: string
- name: title
dtype: string
- name: review_length
dtype: int64
splits:
- name: train
num_bytes: 9064933.18105424
num_examples: 3534
- name: validation
num_bytes: 1008069.8189457601
num_examples: 393
download_size: 5732599
dataset_size: 10073003.0
---
# Dataset Card for "turknews-mini"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
santoshtyss/canadian_court_cases | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 150955544
num_examples: 8486
- name: validation
num_bytes: 24710969
num_examples: 1400
download_size: 87658645
dataset_size: 175666513
---
# Dataset Card for "canadian_court_cases"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_uukuguy__speechless-mistral-six-in-one-7b-orth-1.0 | ---
pretty_name: Evaluation run of uukuguy/speechless-mistral-six-in-one-7b-orth-1.0
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [uukuguy/speechless-mistral-six-in-one-7b-orth-1.0](https://huggingface.co/uukuguy/speechless-mistral-six-in-one-7b-orth-1.0)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_uukuguy__speechless-mistral-six-in-one-7b-orth-1.0\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-13T11:13:22.485134](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__speechless-mistral-six-in-one-7b-orth-1.0/blob/main/results_2023-12-13T11-13-22.485134.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.23196194129343728,\n\
\ \"acc_stderr\": 0.029934654752561563,\n \"acc_norm\": 0.2314240573187148,\n\
\ \"acc_norm_stderr\": 0.03071122006512167,\n \"mc1\": 1.0,\n \
\ \"mc1_stderr\": 0.0,\n \"mc2\": NaN,\n \"mc2_stderr\": NaN\n\
\ },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.22696245733788395,\n\
\ \"acc_stderr\": 0.012240491536132861,\n \"acc_norm\": 0.22696245733788395,\n\
\ \"acc_norm_stderr\": 0.012240491536132861\n },\n \"harness|hellaswag|10\"\
: {\n \"acc\": 0.2504481179047998,\n \"acc_stderr\": 0.004323856300539177,\n\
\ \"acc_norm\": 0.2504481179047998,\n \"acc_norm_stderr\": 0.004323856300539177\n\
\ },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.22,\n\
\ \"acc_stderr\": 0.04163331998932268,\n \"acc_norm\": 0.22,\n \
\ \"acc_norm_stderr\": 0.04163331998932268\n },\n \"harness|hendrycksTest-anatomy|5\"\
: {\n \"acc\": 0.18518518518518517,\n \"acc_stderr\": 0.03355677216313142,\n\
\ \"acc_norm\": 0.18518518518518517,\n \"acc_norm_stderr\": 0.03355677216313142\n\
\ },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.17763157894736842,\n\
\ \"acc_stderr\": 0.031103182383123398,\n \"acc_norm\": 0.17763157894736842,\n\
\ \"acc_norm_stderr\": 0.031103182383123398\n },\n \"harness|hendrycksTest-business_ethics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.21509433962264152,\n\
\ \"acc_stderr\": 0.02528839450289137,\n \"acc_norm\": 0.21509433962264152,\n\
\ \"acc_norm_stderr\": 0.02528839450289137\n },\n \"harness|hendrycksTest-college_biology|5\"\
: {\n \"acc\": 0.2569444444444444,\n \"acc_stderr\": 0.03653946969442099,\n\
\ \"acc_norm\": 0.2569444444444444,\n \"acc_norm_stderr\": 0.03653946969442099\n\
\ },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\":\
\ 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.2,\n\
\ \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.21,\n\
\ \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \
\ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.20809248554913296,\n \"acc_stderr\": 0.030952890217749874,\n\
\ \"acc_norm\": 0.20809248554913296,\n \"acc_norm_stderr\": 0.030952890217749874\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.21568627450980393,\n\
\ \"acc_stderr\": 0.04092563958237654,\n \"acc_norm\": 0.21568627450980393,\n\
\ \"acc_norm_stderr\": 0.04092563958237654\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\":\
\ 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n \"\
acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n\
\ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.20899470899470898,\n \"acc_stderr\": 0.02094048156533486,\n \"\
acc_norm\": 0.20899470899470898,\n \"acc_norm_stderr\": 0.02094048156533486\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.04040610178208841,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.04040610178208841\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.1774193548387097,\n \"acc_stderr\": 0.02173254068932927,\n \"\
acc_norm\": 0.1774193548387097,\n \"acc_norm_stderr\": 0.02173254068932927\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.15270935960591134,\n \"acc_stderr\": 0.02530890453938063,\n \"\
acc_norm\": 0.15270935960591134,\n \"acc_norm_stderr\": 0.02530890453938063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"\
acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860664,\n\
\ \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860664\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.20256410256410257,\n \"acc_stderr\": 0.020377660970371372,\n\
\ \"acc_norm\": 0.20256410256410257,\n \"acc_norm_stderr\": 0.020377660970371372\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2111111111111111,\n \"acc_stderr\": 0.024882116857655075,\n \
\ \"acc_norm\": 0.2111111111111111,\n \"acc_norm_stderr\": 0.024882116857655075\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n\
\ \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"\
acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.1926605504587156,\n \"acc_stderr\": 0.016909276884936094,\n \"\
acc_norm\": 0.1926605504587156,\n \"acc_norm_stderr\": 0.016909276884936094\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.1527777777777778,\n \"acc_stderr\": 0.024536326026134224,\n \"\
acc_norm\": 0.1527777777777778,\n \"acc_norm_stderr\": 0.024536326026134224\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n\
\ \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.31390134529147984,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.31390134529147984,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n\
\ \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n\
\ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2905982905982906,\n\
\ \"acc_stderr\": 0.02974504857267404,\n \"acc_norm\": 0.2905982905982906,\n\
\ \"acc_norm_stderr\": 0.02974504857267404\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23754789272030652,\n\
\ \"acc_stderr\": 0.015218733046150193,\n \"acc_norm\": 0.23754789272030652,\n\
\ \"acc_norm_stderr\": 0.015218733046150193\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n\
\ \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351284,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351284\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n\
\ \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n\
\ \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n\
\ \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432417,\n \
\ \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432417\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n\
\ \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n\
\ \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n\
\ \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n\
\ },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.18775510204081633,\n\
\ \"acc_stderr\": 0.02500025603954621,\n \"acc_norm\": 0.18775510204081633,\n\
\ \"acc_norm_stderr\": 0.02500025603954621\n },\n \"harness|hendrycksTest-sociology|5\"\
: {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401465,\n\
\ \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n\
\ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370518,\n\
\ \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370518\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3216374269005848,\n\
\ \"acc_stderr\": 0.03582529442573122,\n \"acc_norm\": 0.3216374269005848,\n\
\ \"acc_norm_stderr\": 0.03582529442573122\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 1.0,\n \"mc1_stderr\": 0.0,\n \"mc2\": NaN,\n\
\ \"mc2_stderr\": NaN\n },\n \"harness|winogrande|5\": {\n \"\
acc\": 0.4956590370955012,\n \"acc_stderr\": 0.014051956064076911\n },\n\
\ \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n\
\ }\n}\n```"
repo_url: https://huggingface.co/uukuguy/speechless-mistral-six-in-one-7b-orth-1.0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_13T11_13_22.485134
path:
- '**/details_harness|arc:challenge|25_2023-12-13T11-13-22.485134.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-13T11-13-22.485134.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_13T11_13_22.485134
path:
- '**/details_harness|gsm8k|5_2023-12-13T11-13-22.485134.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-13T11-13-22.485134.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_13T11_13_22.485134
path:
- '**/details_harness|hellaswag|10_2023-12-13T11-13-22.485134.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-13T11-13-22.485134.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_13T11_13_22.485134
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-13T11-13-22.485134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-13T11-13-22.485134.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-13T11-13-22.485134.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_13T11_13_22.485134
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T11-13-22.485134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T11-13-22.485134.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_13T11_13_22.485134
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-13T11-13-22.485134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-13T11-13-22.485134.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_13T11_13_22.485134
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-13T11-13-22.485134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-13T11-13-22.485134.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_13T11_13_22.485134
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T11-13-22.485134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T11-13-22.485134.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_13T11_13_22.485134
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T11-13-22.485134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T11-13-22.485134.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_13T11_13_22.485134
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-13T11-13-22.485134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-13T11-13-22.485134.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_13T11_13_22.485134
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T11-13-22.485134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T11-13-22.485134.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_13T11_13_22.485134
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T11-13-22.485134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T11-13-22.485134.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_13T11_13_22.485134
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T11-13-22.485134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T11-13-22.485134.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_13T11_13_22.485134
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T11-13-22.485134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T11-13-22.485134.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_13T11_13_22.485134
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-13T11-13-22.485134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-13T11-13-22.485134.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_13T11_13_22.485134
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-13T11-13-22.485134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-13T11-13-22.485134.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_13T11_13_22.485134
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T11-13-22.485134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T11-13-22.485134.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_13T11_13_22.485134
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-13T11-13-22.485134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-13T11-13-22.485134.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_13T11_13_22.485134
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T11-13-22.485134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T11-13-22.485134.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_13T11_13_22.485134
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T11-13-22.485134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T11-13-22.485134.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_13T11_13_22.485134
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T11-13-22.485134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T11-13-22.485134.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_13T11_13_22.485134
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-13T11-13-22.485134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-13T11-13-22.485134.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_13T11_13_22.485134
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T11-13-22.485134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T11-13-22.485134.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_13T11_13_22.485134
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T11-13-22.485134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T11-13-22.485134.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_13T11_13_22.485134
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T11-13-22.485134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T11-13-22.485134.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_13T11_13_22.485134
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T11-13-22.485134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T11-13-22.485134.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_13T11_13_22.485134
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T11-13-22.485134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T11-13-22.485134.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_13T11_13_22.485134
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T11-13-22.485134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T11-13-22.485134.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_13T11_13_22.485134
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T11-13-22.485134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T11-13-22.485134.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_13T11_13_22.485134
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T11-13-22.485134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T11-13-22.485134.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_13T11_13_22.485134
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T11-13-22.485134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T11-13-22.485134.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_13T11_13_22.485134
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T11-13-22.485134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T11-13-22.485134.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_13T11_13_22.485134
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T11-13-22.485134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T11-13-22.485134.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_13T11_13_22.485134
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T11-13-22.485134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T11-13-22.485134.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_13T11_13_22.485134
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T11-13-22.485134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T11-13-22.485134.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_13T11_13_22.485134
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T11-13-22.485134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T11-13-22.485134.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_13T11_13_22.485134
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-13T11-13-22.485134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-13T11-13-22.485134.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_13T11_13_22.485134
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T11-13-22.485134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T11-13-22.485134.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_13T11_13_22.485134
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-13T11-13-22.485134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-13T11-13-22.485134.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_13T11_13_22.485134
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T11-13-22.485134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T11-13-22.485134.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_13T11_13_22.485134
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T11-13-22.485134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T11-13-22.485134.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_13T11_13_22.485134
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T11-13-22.485134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T11-13-22.485134.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_13T11_13_22.485134
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-13T11-13-22.485134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-13T11-13-22.485134.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_13T11_13_22.485134
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-13T11-13-22.485134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-13T11-13-22.485134.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_13T11_13_22.485134
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T11-13-22.485134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T11-13-22.485134.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_13T11_13_22.485134
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T11-13-22.485134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T11-13-22.485134.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_13T11_13_22.485134
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T11-13-22.485134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T11-13-22.485134.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_13T11_13_22.485134
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T11-13-22.485134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T11-13-22.485134.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_13T11_13_22.485134
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-13T11-13-22.485134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-13T11-13-22.485134.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_13T11_13_22.485134
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-13T11-13-22.485134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-13T11-13-22.485134.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_13T11_13_22.485134
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-13T11-13-22.485134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-13T11-13-22.485134.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_13T11_13_22.485134
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T11-13-22.485134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T11-13-22.485134.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_13T11_13_22.485134
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-13T11-13-22.485134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-13T11-13-22.485134.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_13T11_13_22.485134
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T11-13-22.485134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T11-13-22.485134.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_13T11_13_22.485134
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T11-13-22.485134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T11-13-22.485134.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_13T11_13_22.485134
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-13T11-13-22.485134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-13T11-13-22.485134.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_13T11_13_22.485134
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-13T11-13-22.485134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-13T11-13-22.485134.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_13T11_13_22.485134
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-13T11-13-22.485134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-13T11-13-22.485134.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_13T11_13_22.485134
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T11-13-22.485134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T11-13-22.485134.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_13T11_13_22.485134
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-13T11-13-22.485134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-13T11-13-22.485134.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_13T11_13_22.485134
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-13T11-13-22.485134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-13T11-13-22.485134.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_13T11_13_22.485134
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-13T11-13-22.485134.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-13T11-13-22.485134.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_13T11_13_22.485134
path:
- '**/details_harness|winogrande|5_2023-12-13T11-13-22.485134.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-13T11-13-22.485134.parquet'
- config_name: results
data_files:
- split: 2023_12_13T11_13_22.485134
path:
- results_2023-12-13T11-13-22.485134.parquet
- split: latest
path:
- results_2023-12-13T11-13-22.485134.parquet
---
# Dataset Card for Evaluation run of uukuguy/speechless-mistral-six-in-one-7b-orth-1.0
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [uukuguy/speechless-mistral-six-in-one-7b-orth-1.0](https://huggingface.co/uukuguy/speechless-mistral-six-in-one-7b-orth-1.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_uukuguy__speechless-mistral-six-in-one-7b-orth-1.0",
"harness_winogrande_5",
split="train")
```
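The timestamped split names are derived mechanically from the run timestamps. As an illustrative sketch (the helper name `timestamp_to_split` is ours, not part of any library), the mapping replaces the date and time separators with underscores:

```python
def timestamp_to_split(timestamp: str) -> str:
    # The split names are the run timestamps with "-" and ":" replaced by "_",
    # e.g. "2023-12-13T11:13:22.485134" -> "2023_12_13T11_13_22.485134".
    return timestamp.replace("-", "_").replace(":", "_")

print(timestamp_to_split("2023-12-13T11:13:22.485134"))
# -> 2023_12_13T11_13_22.485134
```

The resulting string matches the timestamped split names listed under each `config_name` in the YAML header above.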
## Latest results
These are the [latest results from run 2023-12-13T11:13:22.485134](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__speechless-mistral-six-in-one-7b-orth-1.0/blob/main/results_2023-12-13T11-13-22.485134.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.23196194129343728,
"acc_stderr": 0.029934654752561563,
"acc_norm": 0.2314240573187148,
"acc_norm_stderr": 0.03071122006512167,
"mc1": 1.0,
"mc1_stderr": 0.0,
"mc2": NaN,
"mc2_stderr": NaN
},
"harness|arc:challenge|25": {
"acc": 0.22696245733788395,
"acc_stderr": 0.012240491536132861,
"acc_norm": 0.22696245733788395,
"acc_norm_stderr": 0.012240491536132861
},
"harness|hellaswag|10": {
"acc": 0.2504481179047998,
"acc_stderr": 0.004323856300539177,
"acc_norm": 0.2504481179047998,
"acc_norm_stderr": 0.004323856300539177
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.03355677216313142,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.03355677216313142
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21509433962264152,
"acc_stderr": 0.02528839450289137,
"acc_norm": 0.21509433962264152,
"acc_norm_stderr": 0.02528839450289137
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749874,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749874
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102973,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102973
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813365,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813365
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.20899470899470898,
"acc_stderr": 0.02094048156533486,
"acc_norm": 0.20899470899470898,
"acc_norm_stderr": 0.02094048156533486
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04040610178208841,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04040610178208841
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.1774193548387097,
"acc_stderr": 0.02173254068932927,
"acc_norm": 0.1774193548387097,
"acc_norm_stderr": 0.02173254068932927
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.15270935960591134,
"acc_stderr": 0.02530890453938063,
"acc_norm": 0.15270935960591134,
"acc_norm_stderr": 0.02530890453938063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.17676767676767677,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.17676767676767677,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19689119170984457,
"acc_stderr": 0.028697873971860664,
"acc_norm": 0.19689119170984457,
"acc_norm_stderr": 0.028697873971860664
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.20256410256410257,
"acc_stderr": 0.020377660970371372,
"acc_norm": 0.20256410256410257,
"acc_norm_stderr": 0.020377660970371372
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2111111111111111,
"acc_stderr": 0.024882116857655075,
"acc_norm": 0.2111111111111111,
"acc_norm_stderr": 0.024882116857655075
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436776,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436776
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1926605504587156,
"acc_stderr": 0.016909276884936094,
"acc_norm": 0.1926605504587156,
"acc_norm_stderr": 0.016909276884936094
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1527777777777778,
"acc_stderr": 0.024536326026134224,
"acc_norm": 0.1527777777777778,
"acc_norm_stderr": 0.024536326026134224
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.31390134529147984,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.31390134529147984,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2905982905982906,
"acc_stderr": 0.02974504857267404,
"acc_norm": 0.2905982905982906,
"acc_norm_stderr": 0.02974504857267404
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.23754789272030652,
"acc_stderr": 0.015218733046150193,
"acc_norm": 0.23754789272030652,
"acc_norm_stderr": 0.015218733046150193
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.023929155517351284,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.023929155517351284
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.1864951768488746,
"acc_stderr": 0.02212243977248077,
"acc_norm": 0.1864951768488746,
"acc_norm_stderr": 0.02212243977248077
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21604938271604937,
"acc_stderr": 0.022899162918445806,
"acc_norm": 0.21604938271604937,
"acc_norm_stderr": 0.022899162918445806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.025257861359432417,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.025257861359432417
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.18382352941176472,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.18382352941176472,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.18775510204081633,
"acc_stderr": 0.02500025603954621,
"acc_norm": 0.18775510204081633,
"acc_norm_stderr": 0.02500025603954621
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370518,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370518
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3216374269005848,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.3216374269005848,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 1.0,
"mc1_stderr": 0.0,
"mc2": NaN,
"mc2_stderr": NaN
},
"harness|winogrande|5": {
"acc": 0.4956590370955012,
"acc_stderr": 0.014051956064076911
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
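The per-task entries above feed the leaderboard's aggregated metrics. The sketch below (using a hand-copied subset of the scores and a plain unweighted mean, which may differ from the leaderboard's exact aggregation) shows the idea:

```python
# A hand-copied subset of the per-task results above (illustrative only).
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.22},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.18518518518518517},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.17763157894736842},
}

# Unweighted mean accuracy over the MMLU (hendrycksTest) subtasks.
mmlu_scores = [v["acc"] for k, v in results.items()
               if k.startswith("harness|hendrycksTest")]
mmlu_acc = sum(mmlu_scores) / len(mmlu_scores)
print(round(mmlu_acc, 4))  # -> 0.1943
```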
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
izou3/New_Instance_Seg_DB | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: image
dtype: image
- name: annotation
dtype: image
splits:
- name: train
num_bytes: 8716961.0
num_examples: 453
- name: test
num_bytes: 3899703.0
num_examples: 133
download_size: 12403234
dataset_size: 12616664.0
---
# Dataset Card for "New_Instance_Seg_DB"
[More Information Needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
cpalenmichel/kmr-bible | ---
dataset_info:
features:
- name: text
dtype: string
- name: path
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 2149151663.368
num_examples: 2754
- name: test
num_bytes: 241863045.0
num_examples: 327
- name: valid
num_bytes: 255423909.0
num_examples: 334
download_size: 2545514213
dataset_size: 2646438617.368
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
---
|
tiennv/gaze-following-test | ---
dataset_info:
features:
- name: image
dtype: image
- name: split
dtype: string
- name: width
dtype: int64
- name: height
dtype: int64
- name: bboxes
dtype: string
- name: labels
dtype: string
- name: cab
dtype: int64
- name: hum
dtype: int64
- name: light
dtype: float64
- name: cam
dtype: int64
- name: env
dtype: int64
- name: gaze_item
dtype: int64
- name: gazeIdx
dtype: int64
- name: gaze_cx
dtype: int64
- name: gaze_cy
dtype: int64
- name: hx
dtype: int64
- name: hy
dtype: int64
- name: pitch
dtype: float64
- name: yaw
dtype: float64
- name: roll
dtype: float64
- name: seg
dtype: string
- name: segm_gazeIdx
dtype: int64
- name: occluded
dtype: int64
splits:
- name: train
num_bytes: 11133726929.8
num_examples: 19200
download_size: 11101174289
dataset_size: 11133726929.8
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "gaze-following-test"
[More Information Needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
TriadParty/deepsex-RP | ---
license: apache-2.0
---
|
open-llm-leaderboard/details_ibivibiv__multimaster-7b-v2 | ---
pretty_name: Evaluation run of ibivibiv/multimaster-7b-v2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ibivibiv/multimaster-7b-v2](https://huggingface.co/ibivibiv/multimaster-7b-v2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ibivibiv__multimaster-7b-v2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-02T20:13:08.310124](https://huggingface.co/datasets/open-llm-leaderboard/details_ibivibiv__multimaster-7b-v2/blob/main/results_2024-02-02T20-13-08.310124.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6563425377048554,\n\
\ \"acc_stderr\": 0.0318718255323794,\n \"acc_norm\": 0.6556612698627262,\n\
\ \"acc_norm_stderr\": 0.03254066149146999,\n \"mc1\": 0.4394124847001224,\n\
\ \"mc1_stderr\": 0.017374520482513707,\n \"mc2\": 0.606301955480099,\n\
\ \"mc2_stderr\": 0.01544904749324005\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6723549488054608,\n \"acc_stderr\": 0.013715847940719339,\n\
\ \"acc_norm\": 0.7047781569965871,\n \"acc_norm_stderr\": 0.013329750293382318\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6995618402708623,\n\
\ \"acc_stderr\": 0.004575116093931906,\n \"acc_norm\": 0.8759211312487553,\n\
\ \"acc_norm_stderr\": 0.0032899775233939097\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.037385206761196686,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.037385206761196686\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.02815283794249387,\n\
\ \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.02815283794249387\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n\
\ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5574468085106383,\n \"acc_stderr\": 0.03246956919789958,\n\
\ \"acc_norm\": 0.5574468085106383,\n \"acc_norm_stderr\": 0.03246956919789958\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.040824829046386284,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.040824829046386284\n \
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4126984126984127,\n \"acc_stderr\": 0.025355741263055266,\n \"\
acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.025355741263055266\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5079365079365079,\n\
\ \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.5079365079365079,\n\
\ \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7677419354838709,\n\
\ \"acc_stderr\": 0.024022256130308235,\n \"acc_norm\": 0.7677419354838709,\n\
\ \"acc_norm_stderr\": 0.024022256130308235\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n\
\ \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.032568666616811015,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.032568666616811015\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586815,\n \"\
acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586815\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.917098445595855,\n \"acc_stderr\": 0.01989934131572178,\n\
\ \"acc_norm\": 0.917098445595855,\n \"acc_norm_stderr\": 0.01989934131572178\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6717948717948717,\n \"acc_stderr\": 0.023807633198657266,\n\
\ \"acc_norm\": 0.6717948717948717,\n \"acc_norm_stderr\": 0.023807633198657266\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.362962962962963,\n \"acc_stderr\": 0.02931820364520686,\n \
\ \"acc_norm\": 0.362962962962963,\n \"acc_norm_stderr\": 0.02931820364520686\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.030066761582977934,\n\
\ \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.030066761582977934\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8495412844036697,\n \"acc_stderr\": 0.015328563932669237,\n \"\
acc_norm\": 0.8495412844036697,\n \"acc_norm_stderr\": 0.015328563932669237\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"\
acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8529411764705882,\n \"acc_stderr\": 0.024857478080250447,\n \"\
acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.024857478080250447\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \
\ \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.03498149385462472,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.03498149385462472\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"\
acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\
\ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\
\ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.031921934489347235,\n\
\ \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.031921934489347235\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\
\ \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n\
\ \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822584,\n\
\ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822584\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768079\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8263090676883781,\n\
\ \"acc_stderr\": 0.01354741565866226,\n \"acc_norm\": 0.8263090676883781,\n\
\ \"acc_norm_stderr\": 0.01354741565866226\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7369942196531792,\n \"acc_stderr\": 0.023703099525258176,\n\
\ \"acc_norm\": 0.7369942196531792,\n \"acc_norm_stderr\": 0.023703099525258176\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.44581005586592176,\n\
\ \"acc_stderr\": 0.016623998513333106,\n \"acc_norm\": 0.44581005586592176,\n\
\ \"acc_norm_stderr\": 0.016623998513333106\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7320261437908496,\n \"acc_stderr\": 0.025360603796242557,\n\
\ \"acc_norm\": 0.7320261437908496,\n \"acc_norm_stderr\": 0.025360603796242557\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n\
\ \"acc_stderr\": 0.025922371788818763,\n \"acc_norm\": 0.7041800643086816,\n\
\ \"acc_norm_stderr\": 0.025922371788818763\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7345679012345679,\n \"acc_stderr\": 0.024569223600460842,\n\
\ \"acc_norm\": 0.7345679012345679,\n \"acc_norm_stderr\": 0.024569223600460842\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \
\ \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4661016949152542,\n\
\ \"acc_stderr\": 0.012740853872949834,\n \"acc_norm\": 0.4661016949152542,\n\
\ \"acc_norm_stderr\": 0.012740853872949834\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.02833295951403121,\n\
\ \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.02833295951403121\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6928104575163399,\n \"acc_stderr\": 0.01866335967146367,\n \
\ \"acc_norm\": 0.6928104575163399,\n \"acc_norm_stderr\": 0.01866335967146367\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784593,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784593\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n\
\ \"acc_stderr\": 0.025538433368578334,\n \"acc_norm\": 0.845771144278607,\n\
\ \"acc_norm_stderr\": 0.025538433368578334\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4394124847001224,\n\
\ \"mc1_stderr\": 0.017374520482513707,\n \"mc2\": 0.606301955480099,\n\
\ \"mc2_stderr\": 0.01544904749324005\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8429360694554064,\n \"acc_stderr\": 0.010226303949598482\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7187263078089462,\n \
\ \"acc_stderr\": 0.012384789310940244\n }\n}\n```"
repo_url: https://huggingface.co/ibivibiv/multimaster-7b-v2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_02T20_13_08.310124
path:
- '**/details_harness|arc:challenge|25_2024-02-02T20-13-08.310124.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-02T20-13-08.310124.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_02T20_13_08.310124
path:
- '**/details_harness|gsm8k|5_2024-02-02T20-13-08.310124.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-02T20-13-08.310124.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_02T20_13_08.310124
path:
- '**/details_harness|hellaswag|10_2024-02-02T20-13-08.310124.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-02T20-13-08.310124.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_02T20_13_08.310124
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T20-13-08.310124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T20-13-08.310124.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T20-13-08.310124.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_02T20_13_08.310124
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T20-13-08.310124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T20-13-08.310124.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_02T20_13_08.310124
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T20-13-08.310124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T20-13-08.310124.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_02T20_13_08.310124
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T20-13-08.310124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T20-13-08.310124.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_02T20_13_08.310124
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T20-13-08.310124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T20-13-08.310124.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_02T20_13_08.310124
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T20-13-08.310124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T20-13-08.310124.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_02T20_13_08.310124
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T20-13-08.310124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T20-13-08.310124.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_02T20_13_08.310124
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T20-13-08.310124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T20-13-08.310124.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_02T20_13_08.310124
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T20-13-08.310124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T20-13-08.310124.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_02T20_13_08.310124
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T20-13-08.310124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T20-13-08.310124.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_02T20_13_08.310124
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T20-13-08.310124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T20-13-08.310124.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_02T20_13_08.310124
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T20-13-08.310124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T20-13-08.310124.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_02T20_13_08.310124
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T20-13-08.310124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T20-13-08.310124.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_02T20_13_08.310124
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T20-13-08.310124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T20-13-08.310124.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_02T20_13_08.310124
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T20-13-08.310124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T20-13-08.310124.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_02T20_13_08.310124
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T20-13-08.310124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T20-13-08.310124.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_02T20_13_08.310124
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T20-13-08.310124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T20-13-08.310124.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_02T20_13_08.310124
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T20-13-08.310124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T20-13-08.310124.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_02T20_13_08.310124
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T20-13-08.310124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T20-13-08.310124.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_02T20_13_08.310124
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T20-13-08.310124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T20-13-08.310124.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_02T20_13_08.310124
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T20-13-08.310124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T20-13-08.310124.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_02T20_13_08.310124
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T20-13-08.310124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T20-13-08.310124.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_02T20_13_08.310124
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T20-13-08.310124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T20-13-08.310124.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_02T20_13_08.310124
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T20-13-08.310124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T20-13-08.310124.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_02T20_13_08.310124
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T20-13-08.310124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T20-13-08.310124.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_02T20_13_08.310124
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T20-13-08.310124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T20-13-08.310124.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_02T20_13_08.310124
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T20-13-08.310124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T20-13-08.310124.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_02T20_13_08.310124
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T20-13-08.310124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T20-13-08.310124.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_02T20_13_08.310124
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T20-13-08.310124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T20-13-08.310124.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_02T20_13_08.310124
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T20-13-08.310124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T20-13-08.310124.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_02T20_13_08.310124
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T20-13-08.310124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T20-13-08.310124.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_02T20_13_08.310124
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T20-13-08.310124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T20-13-08.310124.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_02T20_13_08.310124
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T20-13-08.310124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T20-13-08.310124.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_02T20_13_08.310124
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T20-13-08.310124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T20-13-08.310124.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_02T20_13_08.310124
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T20-13-08.310124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T20-13-08.310124.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_02T20_13_08.310124
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T20-13-08.310124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T20-13-08.310124.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_02T20_13_08.310124
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T20-13-08.310124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T20-13-08.310124.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_02T20_13_08.310124
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T20-13-08.310124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T20-13-08.310124.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_02T20_13_08.310124
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T20-13-08.310124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T20-13-08.310124.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_02T20_13_08.310124
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-02T20-13-08.310124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-02T20-13-08.310124.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_02T20_13_08.310124
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T20-13-08.310124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T20-13-08.310124.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_02T20_13_08.310124
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T20-13-08.310124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T20-13-08.310124.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_02T20_13_08.310124
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T20-13-08.310124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T20-13-08.310124.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_02T20_13_08.310124
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T20-13-08.310124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T20-13-08.310124.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_02T20_13_08.310124
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T20-13-08.310124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T20-13-08.310124.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_02T20_13_08.310124
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T20-13-08.310124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T20-13-08.310124.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_02T20_13_08.310124
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T20-13-08.310124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T20-13-08.310124.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_02T20_13_08.310124
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T20-13-08.310124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T20-13-08.310124.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_02T20_13_08.310124
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T20-13-08.310124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T20-13-08.310124.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_02T20_13_08.310124
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T20-13-08.310124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T20-13-08.310124.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_02T20_13_08.310124
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T20-13-08.310124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T20-13-08.310124.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_02T20_13_08.310124
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T20-13-08.310124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T20-13-08.310124.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_02T20_13_08.310124
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T20-13-08.310124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T20-13-08.310124.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_02T20_13_08.310124
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T20-13-08.310124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T20-13-08.310124.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_02T20_13_08.310124
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T20-13-08.310124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T20-13-08.310124.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_02T20_13_08.310124
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T20-13-08.310124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T20-13-08.310124.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_02T20_13_08.310124
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T20-13-08.310124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T20-13-08.310124.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_02T20_13_08.310124
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T20-13-08.310124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T20-13-08.310124.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_02T20_13_08.310124
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-02T20-13-08.310124.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-02T20-13-08.310124.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_02T20_13_08.310124
path:
- '**/details_harness|winogrande|5_2024-02-02T20-13-08.310124.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-02T20-13-08.310124.parquet'
- config_name: results
data_files:
- split: 2024_02_02T20_13_08.310124
path:
- results_2024-02-02T20-13-08.310124.parquet
- split: latest
path:
- results_2024-02-02T20-13-08.310124.parquet
---
# Dataset Card for Evaluation run of ibivibiv/multimaster-7b-v2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ibivibiv/multimaster-7b-v2](https://huggingface.co/ibivibiv/multimaster-7b-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ibivibiv__multimaster-7b-v2",
"harness_winogrande_5",
split="train")
```
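The per-task configuration names (e.g. `harness_hendrycksTest_virology_5`, `harness_truthfulqa_mc_0`) follow a fixed pattern: the harness task id with `-` and `:` replaced by `_`, prefixed with `harness_` and suffixed with the few-shot count. A small sketch of that mapping — the helper name is illustrative, not part of the leaderboard tooling:

```python
def harness_config_name(task: str, n_shot: int) -> str:
    """Build a config name such as 'harness_hendrycksTest_virology_5'
    from a harness task id like 'hendrycksTest-virology'."""
    return "harness_" + task.replace("-", "_").replace(":", "_") + f"_{n_shot}"

# The resulting name is what load_dataset expects as its second argument:
# load_dataset("open-llm-leaderboard/details_ibivibiv__multimaster-7b-v2",
#              harness_config_name("hendrycksTest-virology", 5),
#              split="latest")   # requires network access
print(harness_config_name("truthfulqa:mc", 0))  # harness_truthfulqa_mc_0
```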
## Latest results
These are the [latest results from run 2024-02-02T20:13:08.310124](https://huggingface.co/datasets/open-llm-leaderboard/details_ibivibiv__multimaster-7b-v2/blob/main/results_2024-02-02T20-13-08.310124.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6563425377048554,
"acc_stderr": 0.0318718255323794,
"acc_norm": 0.6556612698627262,
"acc_norm_stderr": 0.03254066149146999,
"mc1": 0.4394124847001224,
"mc1_stderr": 0.017374520482513707,
"mc2": 0.606301955480099,
"mc2_stderr": 0.01544904749324005
},
"harness|arc:challenge|25": {
"acc": 0.6723549488054608,
"acc_stderr": 0.013715847940719339,
"acc_norm": 0.7047781569965871,
"acc_norm_stderr": 0.013329750293382318
},
"harness|hellaswag|10": {
"acc": 0.6995618402708623,
"acc_stderr": 0.004575116093931906,
"acc_norm": 0.8759211312487553,
"acc_norm_stderr": 0.0032899775233939097
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.037385206761196686,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.037385206761196686
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.02815283794249387,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.02815283794249387
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5574468085106383,
"acc_stderr": 0.03246956919789958,
"acc_norm": 0.5574468085106383,
"acc_norm_stderr": 0.03246956919789958
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6,
"acc_stderr": 0.040824829046386284,
"acc_norm": 0.6,
"acc_norm_stderr": 0.040824829046386284
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.025355741263055266,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.025355741263055266
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5079365079365079,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.5079365079365079,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7677419354838709,
"acc_stderr": 0.024022256130308235,
"acc_norm": 0.7677419354838709,
"acc_norm_stderr": 0.024022256130308235
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586815,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586815
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.917098445595855,
"acc_stderr": 0.01989934131572178,
"acc_norm": 0.917098445595855,
"acc_norm_stderr": 0.01989934131572178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6717948717948717,
"acc_stderr": 0.023807633198657266,
"acc_norm": 0.6717948717948717,
"acc_norm_stderr": 0.023807633198657266
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.362962962962963,
"acc_stderr": 0.02931820364520686,
"acc_norm": 0.362962962962963,
"acc_norm_stderr": 0.02931820364520686
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6890756302521008,
"acc_stderr": 0.030066761582977934,
"acc_norm": 0.6890756302521008,
"acc_norm_stderr": 0.030066761582977934
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8495412844036697,
"acc_stderr": 0.015328563932669237,
"acc_norm": 0.8495412844036697,
"acc_norm_stderr": 0.015328563932669237
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5231481481481481,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.5231481481481481,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8529411764705882,
"acc_stderr": 0.024857478080250447,
"acc_norm": 0.8529411764705882,
"acc_norm_stderr": 0.024857478080250447
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.02553010046023349,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.02553010046023349
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.03498149385462472,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.03498149385462472
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7914110429447853,
"acc_stderr": 0.031921934489347235,
"acc_norm": 0.7914110429447853,
"acc_norm_stderr": 0.031921934489347235
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822584,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822584
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8263090676883781,
"acc_stderr": 0.01354741565866226,
"acc_norm": 0.8263090676883781,
"acc_norm_stderr": 0.01354741565866226
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7369942196531792,
"acc_stderr": 0.023703099525258176,
"acc_norm": 0.7369942196531792,
"acc_norm_stderr": 0.023703099525258176
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.44581005586592176,
"acc_stderr": 0.016623998513333106,
"acc_norm": 0.44581005586592176,
"acc_norm_stderr": 0.016623998513333106
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7320261437908496,
"acc_stderr": 0.025360603796242557,
"acc_norm": 0.7320261437908496,
"acc_norm_stderr": 0.025360603796242557
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.025922371788818763,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.025922371788818763
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7345679012345679,
"acc_stderr": 0.024569223600460842,
"acc_norm": 0.7345679012345679,
"acc_norm_stderr": 0.024569223600460842
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4661016949152542,
"acc_stderr": 0.012740853872949834,
"acc_norm": 0.4661016949152542,
"acc_norm_stderr": 0.012740853872949834
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.02833295951403121,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.02833295951403121
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6928104575163399,
"acc_stderr": 0.01866335967146367,
"acc_norm": 0.6928104575163399,
"acc_norm_stderr": 0.01866335967146367
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784593,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784593
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578334,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578334
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4394124847001224,
"mc1_stderr": 0.017374520482513707,
"mc2": 0.606301955480099,
"mc2_stderr": 0.01544904749324005
},
"harness|winogrande|5": {
"acc": 0.8429360694554064,
"acc_stderr": 0.010226303949598482
},
"harness|gsm8k|5": {
"acc": 0.7187263078089462,
"acc_stderr": 0.012384789310940244
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
ArunSharmaaaaa/business_for_chatbot | ---
license: apache-2.0
---
|
yfan1997/MultipanelVQA_synthetic | ---
license: cc-by-4.0
---
**Synthetic data in MultipanelVQA**
Paper: Muffin or Chihuahua? Challenging Large Vision-Language Models with Multipanel VQA [(arXiv)](https://arxiv.org/abs/2401.15847)
Website: [https://sites.google.com/view/multipanelvqa/home](https://sites.google.com/view/multipanelvqa/home)
MultipanelVQA includes both [real-world data](https://huggingface.co/datasets/yfan1997/MultipanelVQA_real-world) and synthetic data. |
nyuuzyou/PM-products | ---
annotations_creators:
- crowdsourced
language:
- ru
language_creators:
- crowdsourced
license:
- cc0-1.0
multilinguality:
- monolingual
pretty_name: PochtaMarket products
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- text-generation
task_ids:
- language-modeling
---
# Dataset Card for PochtaMarket products
### Dataset Summary
This dataset was scraped from product pages on the Russian marketplace [PochtaMarket](https://market.pochta.ru). It includes all information from the product card. The dataset was collected by processing around 500 thousand product pages, starting from the first one; at the time of collection, these are assumed to be all the products available on this marketplace. Some fields may be empty, but each record is expected to contain some data: empty responses have been filtered out.
### Languages
The dataset is mostly in Russian, but there may be other languages present.
## Dataset Structure
### Data Fields
This dataset includes the following fields:
- `id`: Identifier for the product (integer)
- `name`: Name of the product (string)
- `description`: Short description of the product (string)
- `longDescription`: Detailed description of the product (string)
- `seoKeywords`: Search engine optimization keywords for the product (string)
- `brand`: Brand name associated with the product (string)
- `providerName`: Name of the provider or seller (string)
### Data Splits
All examples are in the train split; there is no validation split.
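As a hedged sketch of working with these fields (the filtering helper below is illustrative and assumes the field names listed above; loading the dataset itself requires network access):

```python
def has_content(record: dict) -> bool:
    """Mirror the card's note that empty responses were filtered out:
    keep a product record only if its 'name' field is non-empty."""
    return bool(record.get("name", "").strip())

sample = {"id": 1, "name": "Чайник", "description": "", "brand": ""}
print(has_content(sample))  # True

# from datasets import load_dataset
# products = load_dataset("nyuuzyou/PM-products", split="train")
# products = products.filter(has_content)  # requires network access
```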
## Additional Information
### License
This dataset is dedicated to the public domain under the Creative Commons Zero (CC0) license. This means you can:
* Use it for any purpose, including commercial projects.
* Modify it however you like.
* Distribute it without asking permission.
No attribution is required, but it's always appreciated!
CC0 license: https://creativecommons.org/publicdomain/zero/1.0/deed.en
To learn more about CC0, visit the Creative Commons website: https://creativecommons.org/publicdomain/zero/1.0/
### Dataset Curators
- [nyuuzyou](https://ducks.party)
|
CyberHarem/minami_yume | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Minami Yume/南夢芽/南梦芽
This is the dataset of Minami Yume/南夢芽/南梦芽, containing 408 images and their tags.
The core tags of this character are `long_hair, green_eyes, brown_hair, bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 408 | 545.19 MiB | [Download](https://huggingface.co/datasets/CyberHarem/minami_yume/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800              | 408      | 288.67 MiB | [Download](https://huggingface.co/datasets/CyberHarem/minami_yume/resolve/main/dataset-800.zip)               | IMG+TXT    | Dataset with the shorter side not exceeding 800 pixels.               |
| stage3-p480-800 | 992 | 617.73 MiB | [Download](https://huggingface.co/datasets/CyberHarem/minami_yume/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200             | 408      | 471.25 MiB | [Download](https://huggingface.co/datasets/CyberHarem/minami_yume/resolve/main/dataset-1200.zip)              | IMG+TXT    | Dataset with the shorter side not exceeding 1200 pixels.              |
| stage3-p480-1200 | 992 | 905.77 MiB | [Download](https://huggingface.co/datasets/CyberHarem/minami_yume/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide a raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/minami_yume',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 9 |  |  |  |  |  | 1girl, looking_at_viewer, necktie, pink_shirt, solo, simple_background, black_skirt, bracelet, pleated_skirt, school_uniform, smile, white_background, short_sleeves, socks, brown_footwear, loafers, sitting |
| 1 | 8 |  |  |  |  |  | 1girl, brown_footwear, long_sleeves, looking_at_viewer, pleated_skirt, school_uniform, black_pantyhose, blazer, solo, black_jacket, loafers, white_background, black_skirt, collared_shirt, full_body, simple_background, white_shirt, blue_skirt, closed_mouth, holding, sitting, squatting, sweater |
| 2 | 15 |  |  |  |  |  | 1girl, black_pantyhose, blazer, pleated_skirt, school_uniform, solo, black_jacket, black_skirt, long_sleeves, looking_at_viewer, open_jacket, closed_mouth, collared_shirt, white_shirt, miniskirt, sweater, standing, cowboy_shot, backpack, simple_background |
| 3 | 8 |  |  |  |  |  | 1girl, black_pantyhose, looking_at_viewer, pleated_skirt, school_uniform, solo, blazer, long_sleeves, black_jacket, bag |
| 4 | 13 |  |  |  |  |  | 1girl, collared_shirt, solo, white_shirt, looking_at_viewer, school_uniform, blazer, upper_body, black_jacket, simple_background, closed_mouth, open_jacket, white_background, long_sleeves, sweater |
| 5 | 6 |  |  |  |  |  | 1girl, school_uniform, solo, looking_at_viewer, no_shoes, panties_under_pantyhose, soles, toes, ass, black_pantyhose, foot_focus, long_sleeves, looking_back, lying, pleated_skirt, thighband_pantyhose, thighs |
| 6 | 30 |  |  |  |  |  | 1girl, looking_at_viewer, solo, short_shorts, medium_breasts, navel, pink_bikini, cleavage, simple_background, white_bikini, bare_shoulders, collarbone, smile, white_background, blue_shorts, denim_shorts, blush, open_mouth |
| 7 | 5 |  |  |  |  |  | detached_collar, fake_animal_ears, looking_at_viewer, playboy_bunny, rabbit_ears, wrist_cuffs, 1girl, rabbit_tail, black_leotard, blush, cleavage, solo, strapless_leotard, bare_shoulders, black_bowtie, black_pantyhose, blonde_hair, fake_tail, fishnets, green_background, high_heels, multiple_girls, open_mouth, simple_background, small_breasts, smile, standing, white_background, white_footwear |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | necktie | pink_shirt | solo | simple_background | black_skirt | bracelet | pleated_skirt | school_uniform | smile | white_background | short_sleeves | socks | brown_footwear | loafers | sitting | long_sleeves | black_pantyhose | blazer | black_jacket | collared_shirt | full_body | white_shirt | blue_skirt | closed_mouth | holding | squatting | sweater | open_jacket | miniskirt | standing | cowboy_shot | backpack | bag | upper_body | no_shoes | panties_under_pantyhose | soles | toes | ass | foot_focus | looking_back | lying | thighband_pantyhose | thighs | short_shorts | medium_breasts | navel | pink_bikini | cleavage | white_bikini | bare_shoulders | collarbone | blue_shorts | denim_shorts | blush | open_mouth | detached_collar | fake_animal_ears | playboy_bunny | rabbit_ears | wrist_cuffs | rabbit_tail | black_leotard | strapless_leotard | black_bowtie | blonde_hair | fake_tail | fishnets | green_background | high_heels | multiple_girls | small_breasts | white_footwear |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:----------|:-------------|:-------|:--------------------|:--------------|:-----------|:----------------|:-----------------|:--------|:-------------------|:----------------|:--------|:-----------------|:----------|:----------|:---------------|:------------------|:---------|:---------------|:-----------------|:------------|:--------------|:-------------|:---------------|:----------|:------------|:----------|:--------------|:------------|:-----------|:--------------|:-----------|:------|:-------------|:-----------|:--------------------------|:--------|:-------|:------|:-------------|:---------------|:--------|:----------------------|:---------|:---------------|:-----------------|:--------|:--------------|:-----------|:---------------|:-----------------|:-------------|:--------------|:---------------|:--------|:-------------|:------------------|:-------------------|:----------------|:--------------|:--------------|:--------------|:----------------|:--------------------|:---------------|:--------------|:------------|:-----------|:-------------------|:-------------|:-----------------|:----------------|:-----------------|
| 0 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 8 |  |  |  |  |  | X | X | | | X | X | X | | X | X | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 15 |  |  |  |  |  | X | X | | | X | X | X | | X | X | | | | | | | | X | X | X | X | X | | X | | X | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 8 |  |  |  |  |  | X | X | | | X | | | | X | X | | | | | | | | X | X | X | X | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 13 |  |  |  |  |  | X | X | | | X | X | | | | X | | X | | | | | | X | | X | X | X | | X | | X | | | X | X | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 6 |  |  |  |  |  | X | X | | | X | | | | X | X | | | | | | | | X | X | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 30 |  |  |  |  |  | X | X | | | X | X | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | |
| 7 | 5 |  |  |  |  |  | X | X | | | X | X | | | | | X | X | | | | | | | X | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | X | | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
open-llm-leaderboard/details_TeeZee__Buttocks-7B-v1.1 | ---
pretty_name: Evaluation run of TeeZee/Buttocks-7B-v1.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TeeZee/Buttocks-7B-v1.1](https://huggingface.co/TeeZee/Buttocks-7B-v1.1) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TeeZee__Buttocks-7B-v1.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-25T01:11:42.327432](https://huggingface.co/datasets/open-llm-leaderboard/details_TeeZee__Buttocks-7B-v1.1/blob/main/results_2024-01-25T01-11-42.327432.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.49958748050002727,\n\
\ \"acc_stderr\": 0.03449947558483939,\n \"acc_norm\": 0.5072913093747228,\n\
\ \"acc_norm_stderr\": 0.03532795103647748,\n \"mc1\": 0.2962056303549572,\n\
\ \"mc1_stderr\": 0.015983595101811392,\n \"mc2\": 0.4472415883922134,\n\
\ \"mc2_stderr\": 0.015128282783775687\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5213310580204779,\n \"acc_stderr\": 0.014598087973127106,\n\
\ \"acc_norm\": 0.5460750853242321,\n \"acc_norm_stderr\": 0.01454922110517187\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.578868751244772,\n\
\ \"acc_stderr\": 0.004927314729433553,\n \"acc_norm\": 0.7561242780322645,\n\
\ \"acc_norm_stderr\": 0.004285410130466104\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.48148148148148145,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.48148148148148145,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4342105263157895,\n \"acc_stderr\": 0.040335656678483184,\n\
\ \"acc_norm\": 0.4342105263157895,\n \"acc_norm_stderr\": 0.040335656678483184\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.44,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5358490566037736,\n \"acc_stderr\": 0.030693675018458003,\n\
\ \"acc_norm\": 0.5358490566037736,\n \"acc_norm_stderr\": 0.030693675018458003\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5347222222222222,\n\
\ \"acc_stderr\": 0.04171115858181618,\n \"acc_norm\": 0.5347222222222222,\n\
\ \"acc_norm_stderr\": 0.04171115858181618\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n\
\ \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4508670520231214,\n\
\ \"acc_stderr\": 0.03794012674697029,\n \"acc_norm\": 0.4508670520231214,\n\
\ \"acc_norm_stderr\": 0.03794012674697029\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.04724007352383888,\n\
\ \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.04724007352383888\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.63,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.63,\n\
\ \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4595744680851064,\n \"acc_stderr\": 0.03257901482099835,\n\
\ \"acc_norm\": 0.4595744680851064,\n \"acc_norm_stderr\": 0.03257901482099835\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.35964912280701755,\n\
\ \"acc_stderr\": 0.04514496132873633,\n \"acc_norm\": 0.35964912280701755,\n\
\ \"acc_norm_stderr\": 0.04514496132873633\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.45517241379310347,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.45517241379310347,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.36243386243386244,\n \"acc_stderr\": 0.024757473902752035,\n \"\
acc_norm\": 0.36243386243386244,\n \"acc_norm_stderr\": 0.024757473902752035\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2698412698412698,\n\
\ \"acc_stderr\": 0.03970158273235172,\n \"acc_norm\": 0.2698412698412698,\n\
\ \"acc_norm_stderr\": 0.03970158273235172\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.6064516129032258,\n \"acc_stderr\": 0.027791878753132267,\n \"\
acc_norm\": 0.6064516129032258,\n \"acc_norm_stderr\": 0.027791878753132267\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.39408866995073893,\n \"acc_stderr\": 0.034381579670365446,\n \"\
acc_norm\": 0.39408866995073893,\n \"acc_norm_stderr\": 0.034381579670365446\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\"\
: 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6303030303030303,\n \"acc_stderr\": 0.03769430314512568,\n\
\ \"acc_norm\": 0.6303030303030303,\n \"acc_norm_stderr\": 0.03769430314512568\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6565656565656566,\n \"acc_stderr\": 0.03383201223244441,\n \"\
acc_norm\": 0.6565656565656566,\n \"acc_norm_stderr\": 0.03383201223244441\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.689119170984456,\n \"acc_stderr\": 0.03340361906276586,\n\
\ \"acc_norm\": 0.689119170984456,\n \"acc_norm_stderr\": 0.03340361906276586\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4846153846153846,\n \"acc_stderr\": 0.025339003010106522,\n\
\ \"acc_norm\": 0.4846153846153846,\n \"acc_norm_stderr\": 0.025339003010106522\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.28888888888888886,\n \"acc_stderr\": 0.027634907264178544,\n \
\ \"acc_norm\": 0.28888888888888886,\n \"acc_norm_stderr\": 0.027634907264178544\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5378151260504201,\n \"acc_stderr\": 0.03238546948758979,\n \
\ \"acc_norm\": 0.5378151260504201,\n \"acc_norm_stderr\": 0.03238546948758979\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6990825688073394,\n \"acc_stderr\": 0.019664751366802114,\n \"\
acc_norm\": 0.6990825688073394,\n \"acc_norm_stderr\": 0.019664751366802114\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4166666666666667,\n \"acc_stderr\": 0.03362277436608043,\n \"\
acc_norm\": 0.4166666666666667,\n \"acc_norm_stderr\": 0.03362277436608043\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6127450980392157,\n \"acc_stderr\": 0.03418931233833342,\n \"\
acc_norm\": 0.6127450980392157,\n \"acc_norm_stderr\": 0.03418931233833342\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6413502109704642,\n \"acc_stderr\": 0.031219569445301833,\n \
\ \"acc_norm\": 0.6413502109704642,\n \"acc_norm_stderr\": 0.031219569445301833\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5605381165919282,\n\
\ \"acc_stderr\": 0.03331092511038179,\n \"acc_norm\": 0.5605381165919282,\n\
\ \"acc_norm_stderr\": 0.03331092511038179\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5801526717557252,\n \"acc_stderr\": 0.04328577215262971,\n\
\ \"acc_norm\": 0.5801526717557252,\n \"acc_norm_stderr\": 0.04328577215262971\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6528925619834711,\n \"acc_stderr\": 0.04345724570292535,\n \"\
acc_norm\": 0.6528925619834711,\n \"acc_norm_stderr\": 0.04345724570292535\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5648148148148148,\n\
\ \"acc_stderr\": 0.04792898170907061,\n \"acc_norm\": 0.5648148148148148,\n\
\ \"acc_norm_stderr\": 0.04792898170907061\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5828220858895705,\n \"acc_stderr\": 0.03874102859818081,\n\
\ \"acc_norm\": 0.5828220858895705,\n \"acc_norm_stderr\": 0.03874102859818081\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4017857142857143,\n\
\ \"acc_stderr\": 0.04653333146973647,\n \"acc_norm\": 0.4017857142857143,\n\
\ \"acc_norm_stderr\": 0.04653333146973647\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6796116504854369,\n \"acc_stderr\": 0.04620284082280042,\n\
\ \"acc_norm\": 0.6796116504854369,\n \"acc_norm_stderr\": 0.04620284082280042\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7649572649572649,\n\
\ \"acc_stderr\": 0.027778835904935437,\n \"acc_norm\": 0.7649572649572649,\n\
\ \"acc_norm_stderr\": 0.027778835904935437\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6411238825031929,\n\
\ \"acc_stderr\": 0.017152991797501342,\n \"acc_norm\": 0.6411238825031929,\n\
\ \"acc_norm_stderr\": 0.017152991797501342\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.026919095102908273,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.026919095102908273\n \
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23687150837988827,\n\
\ \"acc_stderr\": 0.014219570788103986,\n \"acc_norm\": 0.23687150837988827,\n\
\ \"acc_norm_stderr\": 0.014219570788103986\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5032679738562091,\n \"acc_stderr\": 0.028629305194003543,\n\
\ \"acc_norm\": 0.5032679738562091,\n \"acc_norm_stderr\": 0.028629305194003543\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5852090032154341,\n\
\ \"acc_stderr\": 0.027982680459759567,\n \"acc_norm\": 0.5852090032154341,\n\
\ \"acc_norm_stderr\": 0.027982680459759567\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5123456790123457,\n \"acc_stderr\": 0.027812262269327242,\n\
\ \"acc_norm\": 0.5123456790123457,\n \"acc_norm_stderr\": 0.027812262269327242\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.40070921985815605,\n \"acc_stderr\": 0.029233465745573086,\n \
\ \"acc_norm\": 0.40070921985815605,\n \"acc_norm_stderr\": 0.029233465745573086\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3852672750977836,\n\
\ \"acc_stderr\": 0.012429485434955182,\n \"acc_norm\": 0.3852672750977836,\n\
\ \"acc_norm_stderr\": 0.012429485434955182\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.47058823529411764,\n \"acc_stderr\": 0.030320243265004137,\n\
\ \"acc_norm\": 0.47058823529411764,\n \"acc_norm_stderr\": 0.030320243265004137\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4820261437908497,\n \"acc_stderr\": 0.020214761037872408,\n \
\ \"acc_norm\": 0.4820261437908497,\n \"acc_norm_stderr\": 0.020214761037872408\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5346938775510204,\n \"acc_stderr\": 0.03193207024425314,\n\
\ \"acc_norm\": 0.5346938775510204,\n \"acc_norm_stderr\": 0.03193207024425314\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6318407960199005,\n\
\ \"acc_stderr\": 0.03410410565495302,\n \"acc_norm\": 0.6318407960199005,\n\
\ \"acc_norm_stderr\": 0.03410410565495302\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4578313253012048,\n\
\ \"acc_stderr\": 0.038786267710023595,\n \"acc_norm\": 0.4578313253012048,\n\
\ \"acc_norm_stderr\": 0.038786267710023595\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6374269005847953,\n \"acc_stderr\": 0.036871306155620606,\n\
\ \"acc_norm\": 0.6374269005847953,\n \"acc_norm_stderr\": 0.036871306155620606\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2962056303549572,\n\
\ \"mc1_stderr\": 0.015983595101811392,\n \"mc2\": 0.4472415883922134,\n\
\ \"mc2_stderr\": 0.015128282783775687\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6890292028413575,\n \"acc_stderr\": 0.013009534736286058\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0576194086429113,\n \
\ \"acc_stderr\": 0.006418593319822861\n }\n}\n```"
repo_url: https://huggingface.co/TeeZee/Buttocks-7B-v1.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_25T01_11_42.327432
path:
- '**/details_harness|arc:challenge|25_2024-01-25T01-11-42.327432.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-25T01-11-42.327432.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_25T01_11_42.327432
path:
- '**/details_harness|gsm8k|5_2024-01-25T01-11-42.327432.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-25T01-11-42.327432.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_25T01_11_42.327432
path:
- '**/details_harness|hellaswag|10_2024-01-25T01-11-42.327432.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-25T01-11-42.327432.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_25T01_11_42.327432
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-25T01-11-42.327432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-25T01-11-42.327432.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-25T01-11-42.327432.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_25T01_11_42.327432
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-25T01-11-42.327432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-25T01-11-42.327432.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_25T01_11_42.327432
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-25T01-11-42.327432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-25T01-11-42.327432.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_25T01_11_42.327432
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-25T01-11-42.327432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-25T01-11-42.327432.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_25T01_11_42.327432
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-25T01-11-42.327432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-25T01-11-42.327432.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_25T01_11_42.327432
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-25T01-11-42.327432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-25T01-11-42.327432.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_25T01_11_42.327432
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-25T01-11-42.327432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-25T01-11-42.327432.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_25T01_11_42.327432
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-25T01-11-42.327432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-25T01-11-42.327432.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_25T01_11_42.327432
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-25T01-11-42.327432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-25T01-11-42.327432.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_25T01_11_42.327432
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-25T01-11-42.327432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-25T01-11-42.327432.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_25T01_11_42.327432
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-25T01-11-42.327432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-25T01-11-42.327432.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_25T01_11_42.327432
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-25T01-11-42.327432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-25T01-11-42.327432.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_25T01_11_42.327432
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-25T01-11-42.327432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-25T01-11-42.327432.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_25T01_11_42.327432
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-25T01-11-42.327432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-25T01-11-42.327432.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_25T01_11_42.327432
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-25T01-11-42.327432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-25T01-11-42.327432.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_25T01_11_42.327432
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-25T01-11-42.327432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-25T01-11-42.327432.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_25T01_11_42.327432
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-25T01-11-42.327432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-25T01-11-42.327432.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_25T01_11_42.327432
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-25T01-11-42.327432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-25T01-11-42.327432.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_25T01_11_42.327432
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-25T01-11-42.327432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-25T01-11-42.327432.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_25T01_11_42.327432
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-25T01-11-42.327432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-25T01-11-42.327432.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_25T01_11_42.327432
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-25T01-11-42.327432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-25T01-11-42.327432.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_25T01_11_42.327432
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-25T01-11-42.327432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-25T01-11-42.327432.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_25T01_11_42.327432
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-25T01-11-42.327432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-25T01-11-42.327432.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_25T01_11_42.327432
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-25T01-11-42.327432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-25T01-11-42.327432.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_25T01_11_42.327432
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-25T01-11-42.327432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-25T01-11-42.327432.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_25T01_11_42.327432
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-25T01-11-42.327432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-25T01-11-42.327432.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_25T01_11_42.327432
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-25T01-11-42.327432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-25T01-11-42.327432.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_25T01_11_42.327432
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-25T01-11-42.327432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-25T01-11-42.327432.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_25T01_11_42.327432
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-25T01-11-42.327432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-25T01-11-42.327432.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_25T01_11_42.327432
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-25T01-11-42.327432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-25T01-11-42.327432.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_25T01_11_42.327432
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-25T01-11-42.327432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-25T01-11-42.327432.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_25T01_11_42.327432
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-25T01-11-42.327432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-25T01-11-42.327432.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_25T01_11_42.327432
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-25T01-11-42.327432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-25T01-11-42.327432.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_25T01_11_42.327432
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-25T01-11-42.327432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-25T01-11-42.327432.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_25T01_11_42.327432
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-25T01-11-42.327432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-25T01-11-42.327432.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_25T01_11_42.327432
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-25T01-11-42.327432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-25T01-11-42.327432.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_25T01_11_42.327432
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-25T01-11-42.327432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-25T01-11-42.327432.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_25T01_11_42.327432
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-25T01-11-42.327432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-25T01-11-42.327432.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_25T01_11_42.327432
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-25T01-11-42.327432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-25T01-11-42.327432.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_25T01_11_42.327432
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-25T01-11-42.327432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-25T01-11-42.327432.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_25T01_11_42.327432
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-25T01-11-42.327432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-25T01-11-42.327432.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_25T01_11_42.327432
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-25T01-11-42.327432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-25T01-11-42.327432.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_25T01_11_42.327432
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-25T01-11-42.327432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-25T01-11-42.327432.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_25T01_11_42.327432
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-25T01-11-42.327432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-25T01-11-42.327432.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_25T01_11_42.327432
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-25T01-11-42.327432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-25T01-11-42.327432.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_25T01_11_42.327432
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-25T01-11-42.327432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-25T01-11-42.327432.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_25T01_11_42.327432
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-25T01-11-42.327432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-25T01-11-42.327432.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_25T01_11_42.327432
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-25T01-11-42.327432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-25T01-11-42.327432.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_25T01_11_42.327432
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-25T01-11-42.327432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-25T01-11-42.327432.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_25T01_11_42.327432
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-25T01-11-42.327432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-25T01-11-42.327432.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_25T01_11_42.327432
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-25T01-11-42.327432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-25T01-11-42.327432.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_25T01_11_42.327432
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-25T01-11-42.327432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-25T01-11-42.327432.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_25T01_11_42.327432
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-25T01-11-42.327432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-25T01-11-42.327432.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_25T01_11_42.327432
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-25T01-11-42.327432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-25T01-11-42.327432.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_25T01_11_42.327432
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-25T01-11-42.327432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-25T01-11-42.327432.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_25T01_11_42.327432
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-25T01-11-42.327432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-25T01-11-42.327432.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_25T01_11_42.327432
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-25T01-11-42.327432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-25T01-11-42.327432.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_25T01_11_42.327432
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-25T01-11-42.327432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-25T01-11-42.327432.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_25T01_11_42.327432
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-25T01-11-42.327432.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-25T01-11-42.327432.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_25T01_11_42.327432
path:
- '**/details_harness|winogrande|5_2024-01-25T01-11-42.327432.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-25T01-11-42.327432.parquet'
- config_name: results
data_files:
- split: 2024_01_25T01_11_42.327432
path:
- results_2024-01-25T01-11-42.327432.parquet
- split: latest
path:
- results_2024-01-25T01-11-42.327432.parquet
---
# Dataset Card for Evaluation run of TeeZee/Buttocks-7B-v1.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [TeeZee/Buttocks-7B-v1.1](https://huggingface.co/TeeZee/Buttocks-7B-v1.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TeeZee__Buttocks-7B-v1.1",
"harness_winogrande_5",
split="train")
```
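Each configuration also exposes a timestamped split alongside `latest`. The timestamped split name is derived from the run timestamp by replacing `-` and `:` with `_`. A minimal sketch of that naming convention (the `load_dataset` call is shown commented out because it requires network access to the Hub):

```python
# Run timestamp as reported in this card
run_timestamp = "2024-01-25T01:11:42.327432"

# Split names replace '-' and ':' with '_' (the 'T' and '.' are kept)
split_name = run_timestamp.replace("-", "_").replace(":", "_")
print(split_name)  # 2024_01_25T01_11_42.327432

# Loading the aggregated scores from the "results" configuration; "latest"
# is an alias for the most recent timestamped split:
# from datasets import load_dataset
# data = load_dataset(
#     "open-llm-leaderboard/details_TeeZee__Buttocks-7B-v1.1",
#     "results",
#     split=split_name,  # or split="latest"
# )
```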
## Latest results
These are the [latest results from run 2024-01-25T01:11:42.327432](https://huggingface.co/datasets/open-llm-leaderboard/details_TeeZee__Buttocks-7B-v1.1/blob/main/results_2024-01-25T01-11-42.327432.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.49958748050002727,
"acc_stderr": 0.03449947558483939,
"acc_norm": 0.5072913093747228,
"acc_norm_stderr": 0.03532795103647748,
"mc1": 0.2962056303549572,
"mc1_stderr": 0.015983595101811392,
"mc2": 0.4472415883922134,
"mc2_stderr": 0.015128282783775687
},
"harness|arc:challenge|25": {
"acc": 0.5213310580204779,
"acc_stderr": 0.014598087973127106,
"acc_norm": 0.5460750853242321,
"acc_norm_stderr": 0.01454922110517187
},
"harness|hellaswag|10": {
"acc": 0.578868751244772,
"acc_stderr": 0.004927314729433553,
"acc_norm": 0.7561242780322645,
"acc_norm_stderr": 0.004285410130466104
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4342105263157895,
"acc_stderr": 0.040335656678483184,
"acc_norm": 0.4342105263157895,
"acc_norm_stderr": 0.040335656678483184
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5358490566037736,
"acc_stderr": 0.030693675018458003,
"acc_norm": 0.5358490566037736,
"acc_norm_stderr": 0.030693675018458003
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5347222222222222,
"acc_stderr": 0.04171115858181618,
"acc_norm": 0.5347222222222222,
"acc_norm_stderr": 0.04171115858181618
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4508670520231214,
"acc_stderr": 0.03794012674697029,
"acc_norm": 0.4508670520231214,
"acc_norm_stderr": 0.03794012674697029
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.04724007352383888,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.04724007352383888
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.63,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.63,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4595744680851064,
"acc_stderr": 0.03257901482099835,
"acc_norm": 0.4595744680851064,
"acc_norm_stderr": 0.03257901482099835
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.35964912280701755,
"acc_stderr": 0.04514496132873633,
"acc_norm": 0.35964912280701755,
"acc_norm_stderr": 0.04514496132873633
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.45517241379310347,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.45517241379310347,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.36243386243386244,
"acc_stderr": 0.024757473902752035,
"acc_norm": 0.36243386243386244,
"acc_norm_stderr": 0.024757473902752035
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2698412698412698,
"acc_stderr": 0.03970158273235172,
"acc_norm": 0.2698412698412698,
"acc_norm_stderr": 0.03970158273235172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6064516129032258,
"acc_stderr": 0.027791878753132267,
"acc_norm": 0.6064516129032258,
"acc_norm_stderr": 0.027791878753132267
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.39408866995073893,
"acc_stderr": 0.034381579670365446,
"acc_norm": 0.39408866995073893,
"acc_norm_stderr": 0.034381579670365446
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6303030303030303,
"acc_stderr": 0.03769430314512568,
"acc_norm": 0.6303030303030303,
"acc_norm_stderr": 0.03769430314512568
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6565656565656566,
"acc_stderr": 0.03383201223244441,
"acc_norm": 0.6565656565656566,
"acc_norm_stderr": 0.03383201223244441
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.689119170984456,
"acc_stderr": 0.03340361906276586,
"acc_norm": 0.689119170984456,
"acc_norm_stderr": 0.03340361906276586
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4846153846153846,
"acc_stderr": 0.025339003010106522,
"acc_norm": 0.4846153846153846,
"acc_norm_stderr": 0.025339003010106522
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.28888888888888886,
"acc_stderr": 0.027634907264178544,
"acc_norm": 0.28888888888888886,
"acc_norm_stderr": 0.027634907264178544
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5378151260504201,
"acc_stderr": 0.03238546948758979,
"acc_norm": 0.5378151260504201,
"acc_norm_stderr": 0.03238546948758979
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6990825688073394,
"acc_stderr": 0.019664751366802114,
"acc_norm": 0.6990825688073394,
"acc_norm_stderr": 0.019664751366802114
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4166666666666667,
"acc_stderr": 0.03362277436608043,
"acc_norm": 0.4166666666666667,
"acc_norm_stderr": 0.03362277436608043
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6127450980392157,
"acc_stderr": 0.03418931233833342,
"acc_norm": 0.6127450980392157,
"acc_norm_stderr": 0.03418931233833342
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6413502109704642,
"acc_stderr": 0.031219569445301833,
"acc_norm": 0.6413502109704642,
"acc_norm_stderr": 0.031219569445301833
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5605381165919282,
"acc_stderr": 0.03331092511038179,
"acc_norm": 0.5605381165919282,
"acc_norm_stderr": 0.03331092511038179
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5801526717557252,
"acc_stderr": 0.04328577215262971,
"acc_norm": 0.5801526717557252,
"acc_norm_stderr": 0.04328577215262971
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6528925619834711,
"acc_stderr": 0.04345724570292535,
"acc_norm": 0.6528925619834711,
"acc_norm_stderr": 0.04345724570292535
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5648148148148148,
"acc_stderr": 0.04792898170907061,
"acc_norm": 0.5648148148148148,
"acc_norm_stderr": 0.04792898170907061
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5828220858895705,
"acc_stderr": 0.03874102859818081,
"acc_norm": 0.5828220858895705,
"acc_norm_stderr": 0.03874102859818081
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4017857142857143,
"acc_stderr": 0.04653333146973647,
"acc_norm": 0.4017857142857143,
"acc_norm_stderr": 0.04653333146973647
},
"harness|hendrycksTest-management|5": {
"acc": 0.6796116504854369,
"acc_stderr": 0.04620284082280042,
"acc_norm": 0.6796116504854369,
"acc_norm_stderr": 0.04620284082280042
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7649572649572649,
"acc_stderr": 0.027778835904935437,
"acc_norm": 0.7649572649572649,
"acc_norm_stderr": 0.027778835904935437
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6411238825031929,
"acc_stderr": 0.017152991797501342,
"acc_norm": 0.6411238825031929,
"acc_norm_stderr": 0.017152991797501342
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5,
"acc_stderr": 0.026919095102908273,
"acc_norm": 0.5,
"acc_norm_stderr": 0.026919095102908273
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23687150837988827,
"acc_stderr": 0.014219570788103986,
"acc_norm": 0.23687150837988827,
"acc_norm_stderr": 0.014219570788103986
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5032679738562091,
"acc_stderr": 0.028629305194003543,
"acc_norm": 0.5032679738562091,
"acc_norm_stderr": 0.028629305194003543
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5852090032154341,
"acc_stderr": 0.027982680459759567,
"acc_norm": 0.5852090032154341,
"acc_norm_stderr": 0.027982680459759567
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5123456790123457,
"acc_stderr": 0.027812262269327242,
"acc_norm": 0.5123456790123457,
"acc_norm_stderr": 0.027812262269327242
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.40070921985815605,
"acc_stderr": 0.029233465745573086,
"acc_norm": 0.40070921985815605,
"acc_norm_stderr": 0.029233465745573086
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3852672750977836,
"acc_stderr": 0.012429485434955182,
"acc_norm": 0.3852672750977836,
"acc_norm_stderr": 0.012429485434955182
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.47058823529411764,
"acc_stderr": 0.030320243265004137,
"acc_norm": 0.47058823529411764,
"acc_norm_stderr": 0.030320243265004137
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4820261437908497,
"acc_stderr": 0.020214761037872408,
"acc_norm": 0.4820261437908497,
"acc_norm_stderr": 0.020214761037872408
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5346938775510204,
"acc_stderr": 0.03193207024425314,
"acc_norm": 0.5346938775510204,
"acc_norm_stderr": 0.03193207024425314
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6318407960199005,
"acc_stderr": 0.03410410565495302,
"acc_norm": 0.6318407960199005,
"acc_norm_stderr": 0.03410410565495302
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4578313253012048,
"acc_stderr": 0.038786267710023595,
"acc_norm": 0.4578313253012048,
"acc_norm_stderr": 0.038786267710023595
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6374269005847953,
"acc_stderr": 0.036871306155620606,
"acc_norm": 0.6374269005847953,
"acc_norm_stderr": 0.036871306155620606
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2962056303549572,
"mc1_stderr": 0.015983595101811392,
"mc2": 0.4472415883922134,
"mc2_stderr": 0.015128282783775687
},
"harness|winogrande|5": {
"acc": 0.6890292028413575,
"acc_stderr": 0.013009534736286058
},
"harness|gsm8k|5": {
"acc": 0.0576194086429113,
"acc_stderr": 0.006418593319822861
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
MobeenHameed/khan1 | ---
license: mit
dataset_info:
features:
- name: audio
dtype: audio
- name: transcription
dtype: string
splits:
- name: train
num_bytes: 23646083.0
num_examples: 14
download_size: 23296387
dataset_size: 23646083.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_DreadPoor__WhyAreWeStillHere-7B-slerp | ---
pretty_name: Evaluation run of DreadPoor/WhyAreWeStillHere-7B-slerp
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [DreadPoor/WhyAreWeStillHere-7B-slerp](https://huggingface.co/DreadPoor/WhyAreWeStillHere-7B-slerp)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_DreadPoor__WhyAreWeStillHere-7B-slerp\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-17T03:56:00.783844](https://huggingface.co/datasets/open-llm-leaderboard/details_DreadPoor__WhyAreWeStillHere-7B-slerp/blob/main/results_2024-02-17T03-56-00.783844.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6545570581109811,\n\
\ \"acc_stderr\": 0.032029886164546564,\n \"acc_norm\": 0.6542595014266129,\n\
\ \"acc_norm_stderr\": 0.03270002296417162,\n \"mc1\": 0.5385556915544676,\n\
\ \"mc1_stderr\": 0.017451384104637452,\n \"mc2\": 0.6812168345875693,\n\
\ \"mc2_stderr\": 0.015141577387322332\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6988054607508533,\n \"acc_stderr\": 0.013406741767847632,\n\
\ \"acc_norm\": 0.7167235494880546,\n \"acc_norm_stderr\": 0.013167478735134575\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7187811192989444,\n\
\ \"acc_stderr\": 0.004486752200430352,\n \"acc_norm\": 0.8824935271858195,\n\
\ \"acc_norm_stderr\": 0.0032136470410029467\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n\
\ \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n\
\ \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7236842105263158,\n \"acc_stderr\": 0.03639057569952928,\n\
\ \"acc_norm\": 0.7236842105263158,\n \"acc_norm_stderr\": 0.03639057569952928\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n\
\ \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.02794321998933714,\n\
\ \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.02794321998933714\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n\
\ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6416184971098265,\n\
\ \"acc_stderr\": 0.03656343653353159,\n \"acc_norm\": 0.6416184971098265,\n\
\ \"acc_norm_stderr\": 0.03656343653353159\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5829787234042553,\n \"acc_stderr\": 0.03223276266711712,\n\
\ \"acc_norm\": 0.5829787234042553,\n \"acc_norm_stderr\": 0.03223276266711712\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4312169312169312,\n \"acc_stderr\": 0.025506481698138208,\n \"\
acc_norm\": 0.4312169312169312,\n \"acc_norm_stderr\": 0.025506481698138208\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7903225806451613,\n \"acc_stderr\": 0.02315787934908352,\n \"\
acc_norm\": 0.7903225806451613,\n \"acc_norm_stderr\": 0.02315787934908352\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"\
acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8080808080808081,\n \"acc_stderr\": 0.028057791672989017,\n \"\
acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.028057791672989017\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033484,\n\
\ \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033484\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6641025641025641,\n \"acc_stderr\": 0.023946724741563976,\n\
\ \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563976\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.030176808288974337,\n\
\ \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.030176808288974337\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4105960264900662,\n \"acc_stderr\": 0.04016689594849929,\n \"\
acc_norm\": 0.4105960264900662,\n \"acc_norm_stderr\": 0.04016689594849929\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374307,\n \"\
acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374307\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8529411764705882,\n \"acc_stderr\": 0.024857478080250458,\n \"\
acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.024857478080250458\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8185654008438819,\n \"acc_stderr\": 0.025085961144579654,\n \
\ \"acc_norm\": 0.8185654008438819,\n \"acc_norm_stderr\": 0.025085961144579654\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159465,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159465\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.03322015795776741,\n\
\ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.03322015795776741\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8275862068965517,\n\
\ \"acc_stderr\": 0.013507943909371802,\n \"acc_norm\": 0.8275862068965517,\n\
\ \"acc_norm_stderr\": 0.013507943909371802\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7543352601156069,\n \"acc_stderr\": 0.023176298203992002,\n\
\ \"acc_norm\": 0.7543352601156069,\n \"acc_norm_stderr\": 0.023176298203992002\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.46033519553072627,\n\
\ \"acc_stderr\": 0.01666979959211203,\n \"acc_norm\": 0.46033519553072627,\n\
\ \"acc_norm_stderr\": 0.01666979959211203\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.025553169991826524,\n\
\ \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.025553169991826524\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n\
\ \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n\
\ \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.02438366553103545,\n\
\ \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.02438366553103545\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5070921985815603,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.5070921985815603,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4654498044328553,\n\
\ \"acc_stderr\": 0.0127397115540457,\n \"acc_norm\": 0.4654498044328553,\n\
\ \"acc_norm_stderr\": 0.0127397115540457\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6911764705882353,\n \"acc_stderr\": 0.02806499816704009,\n\
\ \"acc_norm\": 0.6911764705882353,\n \"acc_norm_stderr\": 0.02806499816704009\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6683006535947712,\n \"acc_stderr\": 0.01904748523936038,\n \
\ \"acc_norm\": 0.6683006535947712,\n \"acc_norm_stderr\": 0.01904748523936038\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.02866685779027465,\n\
\ \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.02866685779027465\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n\
\ \"acc_stderr\": 0.025538433368578337,\n \"acc_norm\": 0.845771144278607,\n\
\ \"acc_norm_stderr\": 0.025538433368578337\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.038786267710023595,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.038786267710023595\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5385556915544676,\n\
\ \"mc1_stderr\": 0.017451384104637452,\n \"mc2\": 0.6812168345875693,\n\
\ \"mc2_stderr\": 0.015141577387322332\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8547750591949487,\n \"acc_stderr\": 0.009902153904760824\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6535253980288097,\n \
\ \"acc_stderr\": 0.013107179054313403\n }\n}\n```"
repo_url: https://huggingface.co/teknium/OpenHermes-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_17T03_56_00.783844
path:
- '**/details_harness|arc:challenge|25_2024-02-17T03-56-00.783844.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-17T03-56-00.783844.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_17T03_56_00.783844
path:
- '**/details_harness|gsm8k|5_2024-02-17T03-56-00.783844.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-17T03-56-00.783844.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_17T03_56_00.783844
path:
- '**/details_harness|hellaswag|10_2024-02-17T03-56-00.783844.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-17T03-56-00.783844.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_17T03_56_00.783844
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-17T03-56-00.783844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-17T03-56-00.783844.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-17T03-56-00.783844.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_17T03_56_00.783844
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T03-56-00.783844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T03-56-00.783844.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_17T03_56_00.783844
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-17T03-56-00.783844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-17T03-56-00.783844.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_17T03_56_00.783844
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-17T03-56-00.783844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-17T03-56-00.783844.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_17T03_56_00.783844
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T03-56-00.783844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T03-56-00.783844.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_17T03_56_00.783844
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T03-56-00.783844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T03-56-00.783844.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_17T03_56_00.783844
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-17T03-56-00.783844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-17T03-56-00.783844.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_17T03_56_00.783844
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T03-56-00.783844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T03-56-00.783844.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_17T03_56_00.783844
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T03-56-00.783844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T03-56-00.783844.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_17T03_56_00.783844
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T03-56-00.783844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T03-56-00.783844.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_17T03_56_00.783844
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T03-56-00.783844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T03-56-00.783844.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_17T03_56_00.783844
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-17T03-56-00.783844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-17T03-56-00.783844.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_17T03_56_00.783844
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-17T03-56-00.783844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-17T03-56-00.783844.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_17T03_56_00.783844
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T03-56-00.783844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T03-56-00.783844.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_17T03_56_00.783844
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-17T03-56-00.783844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-17T03-56-00.783844.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_17T03_56_00.783844
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T03-56-00.783844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T03-56-00.783844.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_17T03_56_00.783844
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T03-56-00.783844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T03-56-00.783844.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_17T03_56_00.783844
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T03-56-00.783844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T03-56-00.783844.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_17T03_56_00.783844
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-17T03-56-00.783844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-17T03-56-00.783844.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_17T03_56_00.783844
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T03-56-00.783844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T03-56-00.783844.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_17T03_56_00.783844
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T03-56-00.783844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T03-56-00.783844.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_17T03_56_00.783844
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T03-56-00.783844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T03-56-00.783844.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_17T03_56_00.783844
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T03-56-00.783844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T03-56-00.783844.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_17T03_56_00.783844
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T03-56-00.783844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T03-56-00.783844.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_17T03_56_00.783844
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T03-56-00.783844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T03-56-00.783844.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_17T03_56_00.783844
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T03-56-00.783844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T03-56-00.783844.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_17T03_56_00.783844
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T03-56-00.783844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T03-56-00.783844.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_17T03_56_00.783844
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T03-56-00.783844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T03-56-00.783844.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_17T03_56_00.783844
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T03-56-00.783844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T03-56-00.783844.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_17T03_56_00.783844
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T03-56-00.783844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T03-56-00.783844.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_17T03_56_00.783844
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T03-56-00.783844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T03-56-00.783844.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_17T03_56_00.783844
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T03-56-00.783844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T03-56-00.783844.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_17T03_56_00.783844
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T03-56-00.783844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T03-56-00.783844.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_17T03_56_00.783844
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-17T03-56-00.783844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-17T03-56-00.783844.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_17T03_56_00.783844
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T03-56-00.783844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T03-56-00.783844.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_17T03_56_00.783844
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-17T03-56-00.783844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-17T03-56-00.783844.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_17T03_56_00.783844
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T03-56-00.783844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T03-56-00.783844.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_17T03_56_00.783844
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T03-56-00.783844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T03-56-00.783844.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_17T03_56_00.783844
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T03-56-00.783844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T03-56-00.783844.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_17T03_56_00.783844
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-17T03-56-00.783844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-17T03-56-00.783844.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_17T03_56_00.783844
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-17T03-56-00.783844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-17T03-56-00.783844.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_17T03_56_00.783844
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T03-56-00.783844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T03-56-00.783844.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_17T03_56_00.783844
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T03-56-00.783844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T03-56-00.783844.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_17T03_56_00.783844
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T03-56-00.783844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T03-56-00.783844.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_17T03_56_00.783844
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T03-56-00.783844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T03-56-00.783844.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_17T03_56_00.783844
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-17T03-56-00.783844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-17T03-56-00.783844.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_17T03_56_00.783844
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-17T03-56-00.783844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-17T03-56-00.783844.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_17T03_56_00.783844
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-17T03-56-00.783844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-17T03-56-00.783844.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_17T03_56_00.783844
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T03-56-00.783844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T03-56-00.783844.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_17T03_56_00.783844
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-17T03-56-00.783844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-17T03-56-00.783844.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_17T03_56_00.783844
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T03-56-00.783844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T03-56-00.783844.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_17T03_56_00.783844
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T03-56-00.783844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T03-56-00.783844.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_17T03_56_00.783844
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-17T03-56-00.783844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-17T03-56-00.783844.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_17T03_56_00.783844
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-17T03-56-00.783844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-17T03-56-00.783844.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_17T03_56_00.783844
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-17T03-56-00.783844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-17T03-56-00.783844.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_17T03_56_00.783844
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T03-56-00.783844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T03-56-00.783844.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_17T03_56_00.783844
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-17T03-56-00.783844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-17T03-56-00.783844.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_17T03_56_00.783844
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-17T03-56-00.783844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-17T03-56-00.783844.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_17T03_56_00.783844
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-17T03-56-00.783844.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-17T03-56-00.783844.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_17T03_56_00.783844
path:
- '**/details_harness|winogrande|5_2024-02-17T03-56-00.783844.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-17T03-56-00.783844.parquet'
- config_name: results
data_files:
- split: 2024_02_17T03_56_00.783844
path:
- results_2024-02-17T03-56-00.783844.parquet
- split: latest
path:
- results_2024-02-17T03-56-00.783844.parquet
---
# Dataset Card for Evaluation run of DreadPoor/WhyAreWeStillHere-7B-slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [DreadPoor/WhyAreWeStillHere-7B-slerp](https://huggingface.co/DreadPoor/WhyAreWeStillHere-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_DreadPoor__WhyAreWeStillHere-7B-slerp",
"harness_winogrande_5",
split="train")
```
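As the config list above suggests, each run's split name is simply its timestamp with `-` and `:` replaced by `_` (the `.` before the microseconds is kept). This is a convention inferred from the configs above, not a documented API; a minimal sketch:

```python
def timestamp_to_split(ts: str) -> str:
    """Map a run timestamp such as '2024-02-17T03:56:00.783844' to the
    corresponding split name used in this dataset's configs.
    The replacement rule is inferred from the config list above."""
    return ts.replace("-", "_").replace(":", "_")

print(timestamp_to_split("2024-02-17T03:56:00.783844"))
# 2024_02_17T03_56_00.783844
```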
## Latest results
These are the [latest results from run 2024-02-17T03:56:00.783844](https://huggingface.co/datasets/open-llm-leaderboard/details_DreadPoor__WhyAreWeStillHere-7B-slerp/blob/main/results_2024-02-17T03-56-00.783844.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6545570581109811,
"acc_stderr": 0.032029886164546564,
"acc_norm": 0.6542595014266129,
"acc_norm_stderr": 0.03270002296417162,
"mc1": 0.5385556915544676,
"mc1_stderr": 0.017451384104637452,
"mc2": 0.6812168345875693,
"mc2_stderr": 0.015141577387322332
},
"harness|arc:challenge|25": {
"acc": 0.6988054607508533,
"acc_stderr": 0.013406741767847632,
"acc_norm": 0.7167235494880546,
"acc_norm_stderr": 0.013167478735134575
},
"harness|hellaswag|10": {
"acc": 0.7187811192989444,
"acc_stderr": 0.004486752200430352,
"acc_norm": 0.8824935271858195,
"acc_norm_stderr": 0.0032136470410029467
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6518518518518519,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.6518518518518519,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7236842105263158,
"acc_stderr": 0.03639057569952928,
"acc_norm": 0.7236842105263158,
"acc_norm_stderr": 0.03639057569952928
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.02794321998933714,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.02794321998933714
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.03656343653353159,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.03656343653353159
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5829787234042553,
"acc_stderr": 0.03223276266711712,
"acc_norm": 0.5829787234042553,
"acc_norm_stderr": 0.03223276266711712
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4312169312169312,
"acc_stderr": 0.025506481698138208,
"acc_norm": 0.4312169312169312,
"acc_norm_stderr": 0.025506481698138208
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.02315787934908352,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.02315787934908352
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8080808080808081,
"acc_stderr": 0.028057791672989017,
"acc_norm": 0.8080808080808081,
"acc_norm_stderr": 0.028057791672989017
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.021500249576033484,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.021500249576033484
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6641025641025641,
"acc_stderr": 0.023946724741563976,
"acc_norm": 0.6641025641025641,
"acc_norm_stderr": 0.023946724741563976
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028593,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6848739495798319,
"acc_stderr": 0.030176808288974337,
"acc_norm": 0.6848739495798319,
"acc_norm_stderr": 0.030176808288974337
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4105960264900662,
"acc_stderr": 0.04016689594849929,
"acc_norm": 0.4105960264900662,
"acc_norm_stderr": 0.04016689594849929
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374307,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374307
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8529411764705882,
"acc_stderr": 0.024857478080250458,
"acc_norm": 0.8529411764705882,
"acc_norm_stderr": 0.024857478080250458
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8185654008438819,
"acc_stderr": 0.025085961144579654,
"acc_norm": 0.8185654008438819,
"acc_norm_stderr": 0.025085961144579654
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159465,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159465
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.03322015795776741,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.03322015795776741
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8275862068965517,
"acc_stderr": 0.013507943909371802,
"acc_norm": 0.8275862068965517,
"acc_norm_stderr": 0.013507943909371802
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7543352601156069,
"acc_stderr": 0.023176298203992002,
"acc_norm": 0.7543352601156069,
"acc_norm_stderr": 0.023176298203992002
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.46033519553072627,
"acc_stderr": 0.01666979959211203,
"acc_norm": 0.46033519553072627,
"acc_norm_stderr": 0.01666979959211203
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.025553169991826524,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.025553169991826524
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.02438366553103545,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.02438366553103545
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5070921985815603,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.5070921985815603,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4654498044328553,
"acc_stderr": 0.0127397115540457,
"acc_norm": 0.4654498044328553,
"acc_norm_stderr": 0.0127397115540457
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6911764705882353,
"acc_stderr": 0.02806499816704009,
"acc_norm": 0.6911764705882353,
"acc_norm_stderr": 0.02806499816704009
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6683006535947712,
"acc_stderr": 0.01904748523936038,
"acc_norm": 0.6683006535947712,
"acc_norm_stderr": 0.01904748523936038
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.02866685779027465,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.02866685779027465
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578337,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578337
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.038786267710023595,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.038786267710023595
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160893,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160893
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5385556915544676,
"mc1_stderr": 0.017451384104637452,
"mc2": 0.6812168345875693,
"mc2_stderr": 0.015141577387322332
},
"harness|winogrande|5": {
"acc": 0.8547750591949487,
"acc_stderr": 0.009902153904760824
},
"harness|gsm8k|5": {
"acc": 0.6535253980288097,
"acc_stderr": 0.013107179054313403
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
sbad/biographySummaries | ---
license: cc
---
Hi, this is my first dataset. If you know how to make it better, please leave a comment. |
liuyanchen1015/MULTI_VALUE_rte_if_would | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 31188
num_examples: 66
- name: train
num_bytes: 30770
num_examples: 64
download_size: 51882
dataset_size: 61958
---
# Dataset Card for "MULTI_VALUE_rte_if_would"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
xwjzds/pretrain_sts_similarity | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 8335942
num_examples: 41191
download_size: 5350395
dataset_size: 8335942
---
# Dataset Card for Sentence Paraphrase Collections

## Dataset Description

- **Repository:**
- **Paper:** [DeTiME: Diffusion-Enhanced Topic Modeling using Encoder-decoder based LLM](https://arxiv.org/abs/2310.15296)
- **Leaderboard:**
- **Point of Contact:** Weijie Xu

### Dataset Summary

Sentence_Paraphrase is a combination of sentence paraphrase tasks from various sources, such as paraphrases generated with ChatGPT, Paraphrase Adversaries from Word Scrambling (PAWS), and the STS benchmark. We filtered out pairs that were detected as non-English, were too short, or did not have a high similarity score.

| Category   | Count  |
|------------|--------|
| Paraphrase | 223241 |

## Dataset Structure

### Data Instances

An example of the data is as follows:

{'input': 'U.S. prosecutors have arrested more than 130 individuals and have seized more than $17 million in a continuing crackdown on Internet fraud and abuse.', 'output': 'More than 130 people have been arrested and $17 million worth of property seized in an Internet fraud sweep announced Friday by three U.S. government agencies.'}

### Data Fields

The data fields are as follows:

- `input` and `output` are paraphrases of a sentence or paragraph.
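The filtering step mentioned in the summary (dropping pairs that are non-English, too short, or not sufficiently similar) can be sketched as below. The heuristics and thresholds here are illustrative assumptions, not the authors' actual pipeline:

```python
def keep_pair(inp: str, out: str, similarity: float,
              min_len: int = 20, min_sim: float = 0.6) -> bool:
    """Toy paraphrase-pair filter. All thresholds are illustrative
    assumptions; `similarity` stands in for an STS-style score."""
    if len(inp) < min_len or len(out) < min_len:
        return False  # too short
    if not (inp.isascii() and out.isascii()):
        return False  # crude stand-in for real language identification
    return similarity >= min_sim

print(keep_pair(
    "U.S. prosecutors have arrested more than 130 individuals.",
    "More than 130 people have been arrested in a fraud sweep.",
    similarity=0.85))
# True
```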
## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

The dataset is available under the Creative Commons NonCommercial (CC BY-NC 4.0) license.

### Citation Information

    @misc{xu2023detime,
          title={DeTiME: Diffusion-Enhanced Topic Modeling using Encoder-decoder based LLM},
          author={Weijie Xu and Wenxiang Hu and Fanyou Wu and Srinivasan Sengamedu},
          year={2023},
          eprint={2310.15296},
          archivePrefix={arXiv},
          primaryClass={cs.CL}
    } |
Shawt/Shawtsanders | ---
license: openrail
---
|